Examples

Our most robust examples showing all sorts of implementations can be found in our sister library Lightning Bolts.


MNIST Examples

Five MNIST examples show how to gradually convert from pure PyTorch to PyTorch Lightning.

The transition from pure PyTorch through LightningLite is optional, but it can be helpful to learn about.


Basic Examples

This folder contains 2 simple examples.


Domain Examples

This folder contains older examples. For advanced use cases, use the examples in Lightning Bolts instead.


Basic Examples

This folder contains 1 simple example.


Loop Examples

Contains implementations that leverage loop customization to extend the Trainer with new optimization routines.

  • K-fold Cross Validation Loop: implements cross validation with a custom loop and a special datamodule.
  • Yield Loop: enables yielding from the training_step, as in a Python generator. This is useful for automatic optimization with multiple optimizers.
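The loop customization these examples build on follows a small protocol: a custom loop exposes reset(), advance(), and a done property, and the Trainer drives it until completion. The following is a framework-free sketch of that protocol applied to k-fold index splitting; the KFoldIndexLoop class and its hold-out scheme are illustrative, not the actual classes used in the examples.

```python
# Sketch of the reset/advance/done protocol that loop customization builds on.
# Class and method names are illustrative; the real examples subclass
# Lightning's Loop and are driven by the Trainer.

class KFoldIndexLoop:
    """Produces train/val index splits for k-fold cross validation."""

    def __init__(self, num_samples, num_folds):
        self.num_samples = num_samples
        self.num_folds = num_folds
        self.current_fold = 0
        self.splits = []

    @property
    def done(self):
        # The loop is finished once every fold has served as validation.
        return self.current_fold >= self.num_folds

    def reset(self):
        self.current_fold = 0
        self.splits = []

    def advance(self):
        # Hold out every num_folds-th sample (offset by the fold index)
        # as the validation split; the remaining samples form the train split.
        val = [i for i in range(self.num_samples)
               if i % self.num_folds == self.current_fold]
        train = [i for i in range(self.num_samples) if i not in val]
        self.splits.append((train, val))
        self.current_fold += 1

    def run(self):
        # A driver (the Trainer, in Lightning) executes exactly this pattern.
        self.reset()
        while not self.done:
            self.advance()
        return self.splits


splits = KFoldIndexLoop(num_samples=6, num_folds=3).run()
# Every sample lands in exactly one validation split across the three folds.
```

The same reset/advance/done shape underlies the Yield Loop as well; only the work done inside advance() differs.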