# Examples
Our most robust examples showing all sorts of implementations can be found in our sister library Lightning Bolts.
Note that some examples may rely on new features that are only available in the development branch and may be incompatible with tagged releases.
If you see errors, consider switching to the version tag that matches your installed release.
For example, if you're using `pytorch-lightning==1.6.4` in your environment and are seeing issues, run the examples from the `1.6.4` tag.
## MNIST Examples
5 MNIST examples showing how to gradually convert from pure PyTorch to PyTorch Lightning.
The transition through LightningLite from pure PyTorch is optional, but it can be helpful to learn about it; a minimal sketch follows the list below.
- MNIST with vanilla PyTorch
- MNIST with LightningLite
- MNIST LightningLite to LightningModule
- MNIST with LightningModule
- MNIST with LightningModule + LightningDataModule
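As a rough illustration of the intermediate step, here is a minimal LightningLite training loop. This is a sketch only, not the MNIST script shipped in this folder: it uses random tensors instead of MNIST so it stays self-contained, and it assumes the `LightningLite` API available in PyTorch Lightning 1.5–1.9 (`self.setup`, `self.setup_dataloaders`, `self.backward`).

```python
# Minimal sketch of a LightningLite training loop (illustrative, not the repo's MNIST example).
# Random tensors stand in for MNIST so the snippet is self-contained.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

from pytorch_lightning.lite import LightningLite


class LiteTrainer(LightningLite):
    def run(self, epochs: int = 1, lr: float = 0.02):
        dataset = TensorDataset(torch.randn(256, 28 * 28), torch.randint(0, 10, (256,)))
        model = torch.nn.Linear(28 * 28, 10)
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)

        # Lite wraps the model, optimizer and dataloader for the chosen accelerator/strategy.
        model, optimizer = self.setup(model, optimizer)
        train_loader = self.setup_dataloaders(DataLoader(dataset, batch_size=32))

        model.train()
        for _ in range(epochs):
            for x, y in train_loader:
                optimizer.zero_grad()
                loss = F.cross_entropy(model(x), y)
                self.backward(loss)  # replaces loss.backward()
                optimizer.step()


if __name__ == "__main__":
    LiteTrainer(accelerator="cpu", devices=1).run()
```

The remaining steps in the list above replace this hand-written loop with a `LightningModule` and the `Trainer`.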
## Basic Examples
In this folder, we have 3 simple examples:
- Image Classifier (trains arbitrary datasets with arbitrary backbones; a minimal backbone sketch follows this list).
- Image Classifier + DALI (defines the model inside the `LightningModule`).
- Autoencoder
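For context on the "arbitrary backbones" item above, the sketch below shows the general pattern of wrapping any backbone in a `LightningModule`. The class name and the toy backbone are hypothetical stand-ins, not the script shipped in this folder.

```python
# Hypothetical sketch: a LightningModule that accepts any backbone producing class logits.
import torch
from torch import nn
import torch.nn.functional as F
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    def __init__(self, backbone: nn.Module, learning_rate: float = 1e-3):
        super().__init__()
        self.backbone = backbone
        self.learning_rate = learning_rate

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.backbone(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.learning_rate)


# Any module mapping inputs to logits can be plugged in, e.g. a torchvision model.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
model = LitClassifier(backbone)
# pl.Trainer(max_epochs=1).fit(model, train_dataloaders=your_dataloader)  # train as usual
```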
## Domain Examples
This folder contains older examples. You should instead use the examples in Lightning Bolts for advanced use cases.
## Loop Examples
Contains implementations that leverage loop customization to enhance the Trainer with new optimization routines; a sketch of the basic `Loop` interface follows the list below.
- K-fold Cross-Validation Loop: implements cross-validation with a custom loop and a dedicated datamodule.
- Yield Loop: enables yielding from `training_step`, as in a Python generator. Useful for automatic optimization with multiple optimizers.
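For reference, the sketch below shows the bare `Loop` interface these examples build on (`done`, `reset`, `advance`), as it exists in PyTorch Lightning 1.5–1.9. The counting loop is a toy illustration only and is not one of the examples listed above.

```python
# Toy illustration of the Loop interface used for loop customization (PyTorch Lightning 1.x).
from pytorch_lightning.loops import Loop


class CountingLoop(Loop):
    """Runs `advance` a fixed number of times; a real loop would run fitting logic instead."""

    def __init__(self, max_iterations: int):
        super().__init__()
        self.max_iterations = max_iterations
        self.iteration = 0

    @property
    def done(self) -> bool:
        # `run` keeps calling `advance` until this returns True.
        return self.iteration >= self.max_iterations

    def reset(self) -> None:
        # Called at the start of every call to `run`.
        self.iteration = 0

    def advance(self) -> None:
        # One unit of work per call; a K-fold loop would e.g. fit one fold here.
        print(f"iteration {self.iteration}")
        self.iteration += 1


CountingLoop(max_iterations=3).run()
```

In the K-fold example, a custom loop of this kind replaces the Trainer's default `fit_loop`, so each `advance` trains and validates one fold.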