lightning/notebooks
01-mnist-hello-world.ipynb
02-datamodules.ipynb
03-basic-gan.ipynb
04-transformers-text-classification.ipynb
05-trainer-flags-overview.ipynb
06-mnist-tpu-training.ipynb
07-cifar10-baseline.ipynb
README.md

Lightning Notebooks

Official Notebooks

You can easily run any of the official notebooks by clicking the 'Open in Colab' links in the table below 😄

| Notebook | Description | Colab Link |
|---|---|---|
| MNIST Hello World | Train your first Lightning Module on the classic MNIST handwritten digits dataset (a minimal sketch of this pattern follows the table). | Open In Colab |
| Datamodules | Learn about DataModules and train a dataset-agnostic model on MNIST and CIFAR10. | Open In Colab |
| GAN | Train a GAN on the MNIST dataset and learn how to use multiple optimizers in Lightning (see the second sketch below). | Open In Colab |
| BERT | Fine-tune HuggingFace Transformers models on the GLUE benchmark. | Open In Colab |
| Trainer Flags | Overview of the available Lightning Trainer flags. | Open In Colab |
| TPU Training | Train a model on MNIST using TPUs with Lightning. | Open In Colab |
| 94% Baseline | Establish a quick baseline of ~94% accuracy on CIFAR10 using a ResNet in Lightning. | Open In Colab |