William Falcon 2020-10-12 12:15:33 -04:00 committed by GitHub
parent 4b117165fc
commit 42a4fe06b0
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
3 changed files with 30 additions and 31 deletions


@@ -0,0 +1,29 @@
################
AWS/GCP training
################

Lightning has a native solution for training on AWS/GCP at scale (Lightning-Grid).
Grid is in private early access now, but you can request access at `grid.ai <https://www.grid.ai/>`_.

We've designed Grid to work for Lightning users without needing to make ANY changes to their code.

To use Grid, take your regular command:

.. code-block:: bash

    python my_model.py --learning_rate 1e-6 --layers 2 --gpus 4

And change it to use the ``grid train`` command:

.. code-block:: bash

    grid train --grid_gpus 4 my_model.py --learning_rate 'uniform(1e-6, 1e-1, 20)' --layers '[2, 4, 8, 16]'

The above command will launch 80 experiments (20 sampled learning rates × 4 layer settings), each running on
4 GPUs (that's 320 GPUs!), while making ZERO changes to your code.
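
For context, ``my_model.py`` above can be any ordinary Lightning script that exposes its hyperparameters as
command-line flags. A minimal sketch (the toy model, random data, and flag defaults below are illustrative
assumptions, not part of Grid or this commit) might look like this:

.. code-block:: python

    # my_model.py (illustrative sketch): a regular Lightning script whose
    # hyperparameters come from the command line.
    from argparse import ArgumentParser

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    import pytorch_lightning as pl


    class LitModel(pl.LightningModule):
        def __init__(self, learning_rate, layers):
            super().__init__()
            self.save_hyperparameters()
            # stack of simple linear blocks, depth controlled by --layers
            blocks = []
            for _ in range(layers):
                blocks += [nn.Linear(32, 32), nn.ReLU()]
            self.net = nn.Sequential(*blocks, nn.Linear(32, 1))

        def forward(self, x):
            return self.net(x)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return nn.functional.mse_loss(self(x), y)

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters(), lr=self.hparams.learning_rate)


    if __name__ == "__main__":
        parser = ArgumentParser()
        parser.add_argument("--learning_rate", type=float, default=1e-3)
        parser.add_argument("--layers", type=int, default=2)
        parser.add_argument("--gpus", type=int, default=0)
        args = parser.parse_args()

        # random regression data just to keep the sketch self-contained
        train_data = DataLoader(TensorDataset(torch.randn(256, 32), torch.randn(256, 1)), batch_size=32)

        model = LitModel(args.learning_rate, args.layers)
        trainer = pl.Trainer(gpus=args.gpus, max_epochs=1)
        trainer.fit(model, train_data)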

``uniform`` is part of our new expressive syntax, which lets you construct hyperparameter combinations
using more than 20 distributions, lists, and more. Of course, you can also configure all of this using YAML files,
which can be assembled dynamically at runtime.
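
To make the arithmetic above concrete, here is a rough illustration (not Grid's actual implementation) of how
that sweep specification expands into individual runs:

.. code-block:: python

    # Illustration only: expand the sweep spec from the ``grid train`` command above.
    import itertools
    import random

    # ``uniform(1e-6, 1e-1, 20)`` is read here as "draw 20 samples uniformly from [1e-6, 1e-1]"
    learning_rates = [random.uniform(1e-6, 1e-1) for _ in range(20)]
    layer_options = [2, 4, 8, 16]

    experiments = list(itertools.product(learning_rates, layer_options))
    print(len(experiments))      # 80 experiments
    print(len(experiments) * 4)  # 320 GPUs when each experiment runs on 4 GPUs
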
.. hint:: Grid supports the search strategy of your choice! (and much more than just sweeps)


@@ -12,37 +12,6 @@ Hyperparameters
Lightning has utilities to interact seamlessly with the command line ``ArgumentParser``
and plays well with the hyperparameter optimization framework of your choice.
------------

Lightning-Grid
--------------
Lightning has a native solution for doing sweeps and training models at scale called Lightning-Grid.
Grid lets you launch sweeps from your laptop on the cloud provider of your choice. We've designed Grid to
work for Lightning users without needing to make ANY changes to their code.

To use Grid, take your regular command:

.. code-block:: bash

    python my_model.py --learning_rate 1e-6 --layers 2 --gpus 4

And change it to use the ``grid train`` command:

.. code-block:: bash

    grid train --grid_gpus 4 my_model.py --learning_rate 'uniform(1e-6, 1e-1, 20)' --layers '[2, 4, 8, 16]'

The above command will launch 80 experiments (20 sampled learning rates × 4 layer settings), each running on
4 GPUs (that's 320 GPUs!), while making ZERO changes to your code.

``uniform`` is part of our new expressive syntax, which lets you construct hyperparameter combinations
using more than 20 distributions, lists, and more. Of course, you can also configure all of this using YAML files,
which can be assembled dynamically at runtime.

Grid is in private early access now, but you can request access at `grid.ai <https://www.grid.ai/>`_.
.. hint:: Grid supports the search strategy of your choice! (and much more than just sweeps)
----------
ArgumentParser


@@ -82,6 +82,7 @@ PyTorch Lightning Documentation
:name: Common Use Cases
:caption: Common Use Cases
cloud_training
amp
slurm
child_modules