docs
commit db7997c38c (parent a0200e6a69)

@@ -4,8 +4,6 @@ Community Examples

- `Contextual Emotion Detection (DoubleDistilBert) <https://github.com/PyTorchLightning/emotion_transformer>`_.
- `Cotatron: Transcription-Guided Speech Encoder <https://github.com/mindslab-ai/cotatron>`_.
- `FasterRCNN object detection + Hydra <https://github.com/PyTorchLightning/wheat>`_.
- `Hyperparameter optimization with Optuna <https://github.com/optuna/optuna/blob/master/examples/pytorch_lightning_simple.py>`_.
- `Hyperparameter optimization with Ray Tune <https://docs.ray.io/en/master/tune/tutorials/tune-pytorch-lightning.html>`_.
- `Image Inpainting using Partial Convolutions <https://github.com/ryanwongsa/Image-Inpainting>`_.
- `MNIST on TPU <https://colab.research.google.com/drive/1-_LKx4HwAxl5M6xPJmqAAu444LTDQoa3#scrollTo=BHBz1_AnamN_>`_.
- `NER (transformers, TPU) <https://colab.research.google.com/drive/1dBN-wwYUngLYVt985wGs_OKPlK_ANB9D>`_.

@@ -12,6 +12,37 @@ Hyperparameters

Lightning has utilities to interact seamlessly with the command line ``ArgumentParser``
and plays well with the hyperparameter optimization framework of your choice.
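
As a quick sketch of that integration (``LitModel`` is a placeholder for your own
``LightningModule``), the ``Trainer`` can register its flags on a parser and then be
built straight from the parsed arguments:

.. code-block:: python

    from argparse import ArgumentParser

    import pytorch_lightning as pl

    parser = ArgumentParser()
    # add all Trainer flags (--gpus, --max_epochs, ...) to the parser
    parser = pl.Trainer.add_argparse_args(parser)
    args = parser.parse_args()

    model = LitModel()  # placeholder for your own LightningModule
    # build the Trainer directly from the parsed command-line arguments
    trainer = pl.Trainer.from_argparse_args(args)
    trainer.fit(model)
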

------------

Lightning-Grid
--------------
Lightning has a native solution for running sweeps and training models at scale, called Lightning-Grid.
Grid lets you launch sweeps from your laptop on the cloud provider of your choice. We've designed Grid to
work for Lightning users without requiring ANY changes to their code.

To use Grid, take your regular command:

.. code-block:: bash

    python my_model.py --learning_rate 1e-6 --layers 2 --gpus 4

And change it to use the ``grid train`` command:

.. code-block:: bash

    grid train --grid_gpus 4 my_model.py --learning_rate 'uniform(1e-6, 1e-1, 20)' --layers '[2, 4, 8, 16]'

The above command will launch 20 * 4 = 80 experiments, each running on 4 GPUs (320 GPUs!), all with ZERO changes to
your code.
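
To make the arithmetic concrete, here is a rough sketch (not Grid's actual implementation)
of how that single command expands into a grid of runs:

.. code-block:: python

    from itertools import product

    import numpy as np

    # 'uniform(1e-6, 1e-1, 20)' -> 20 sampled learning rates (illustrative stand-in)
    learning_rates = np.random.uniform(1e-6, 1e-1, 20)
    # '[2, 4, 8, 16]' -> 4 explicit layer counts
    layers = [2, 4, 8, 16]

    # every sampled learning rate is crossed with every layer count
    experiments = list(product(learning_rates, layers))
    print(len(experiments))  # 80 experiments; at 4 GPUs each, that's 320 GPUs
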

The ``uniform`` expression is part of our new expressive syntax, which lets you construct hyperparameter
combinations using more than 20 distributions, lists, and more. Of course, you can also configure all of
this with YAML files, which can be assembled dynamically at runtime.

Grid is in private early access now, but you can request access at `grid.ai <https://www.grid.ai/>`_.

.. hint:: Grid supports the search strategy of your choice! (and much more than just sweeps)

----------

ArgumentParser

@@ -291,14 +322,3 @@ and now we can train MNIST or the GAN using the command line interface!

    $ python main.py --model_name gan --encoder_layers 24
    $ python main.py --model_name mnist --layer_1_dim 128
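
For context, here is a condensed sketch of how ``--model_name`` can pick the model;
``GoodGAN`` and ``LitMNIST`` are assumed to be the modules built earlier in this guide:

.. code-block:: python

    from argparse import ArgumentParser

    import pytorch_lightning as pl

    parser = ArgumentParser()
    parser.add_argument("--model_name", type=str, default="gan", choices=["gan", "mnist"])
    # peek at --model_name first so the chosen model can register its own arguments
    temp_args, _ = parser.parse_known_args()
    if temp_args.model_name == "gan":
        parser = GoodGAN.add_model_specific_args(parser)
    else:
        parser = LitMNIST.add_model_specific_args(parser)
    args = parser.parse_args()

    model = GoodGAN(args) if args.model_name == "gan" else LitMNIST(args)
    pl.Trainer.from_argparse_args(args).fit(model)
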
----------

Hyperparameter Optimization
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Lightning is fully compatible with hyperparameter optimization libraries!
Here are some useful ones (see the Optuna sketch after this list for a concrete example):

- `Hydra <https://medium.com/pytorch/hydra-a-fresh-look-at-configuration-for-machine-learning-projects-50583186b710>`_
- `Optuna <https://github.com/optuna/optuna/blob/master/examples/pytorch_lightning_simple.py>`_
- `Ray Tune <https://docs.ray.io/en/master/tune/tutorials/tune-pytorch-lightning.html>`_
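
For instance, a minimal Optuna sketch (``LitMNIST``, its ``learning_rate`` argument, and
the logged ``val_loss`` metric are assumptions carried over from the examples above):

.. code-block:: python

    import optuna
    import pytorch_lightning as pl

    def objective(trial):
        # sample a learning rate on a log scale for this trial
        lr = trial.suggest_loguniform("learning_rate", 1e-6, 1e-1)
        model = LitMNIST(learning_rate=lr)
        trainer = pl.Trainer(max_epochs=3, logger=False, checkpoint_callback=False)
        trainer.fit(model)
        # return the metric Optuna should minimize
        return trainer.callback_metrics["val_loss"].item()

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=20)
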

@@ -49,7 +49,7 @@ Or conda.

    conda install pytorch-lightning -c conda-forge

-------------

The research
============

@@ -486,7 +486,7 @@ Once your training starts, you can view the logs by using your favorite logger o

    tensorboard --logdir ./lightning_logs

-Which will generate automatic tensorboard logs.
+Which will generate automatic TensorBoard logs (or logs for the logger of your choice).

.. figure:: /_images/mnist_imgs/mnist_tb.png
   :alt: mnist CPU bar
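
To use a different logger, one option (a quick sketch) is to pass it to the ``Trainer``
explicitly; ``TensorBoardLogger`` is shown since it is the default:

.. code-block:: python

    from pytorch_lightning import Trainer
    from pytorch_lightning.loggers import TensorBoardLogger

    # equivalent to the default; swap in WandbLogger, CometLogger, etc. the same way
    trainer = Trainer(logger=TensorBoardLogger(save_dir="lightning_logs"))
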

@@ -645,8 +645,8 @@ Lightning has many tools for debugging. Here is an example of just a few of them

.. code-block:: python

-    # run validation every 20% of a training epoch
-    trainer = pl.Trainer(val_check_interval=0.2)
+    # run validation every 25% of a training epoch
+    trainer = pl.Trainer(val_check_interval=0.25)
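
``val_check_interval`` also accepts an int, interpreted as a number of training batches
rather than a fraction of the epoch; a small sketch:

.. code-block:: python

    # run validation every 100 training batches (int = batch count, float = epoch fraction)
    trainer = pl.Trainer(val_check_interval=100)
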

.. code-block:: python