From 42a4fe06b04455d543b7de305af715422a5544c9 Mon Sep 17 00:00:00 2001
From: William Falcon
Date: Mon, 12 Oct 2020 12:15:33 -0400
Subject: [PATCH] docs (#4096)

---
 docs/source/cloud_training.rst  | 29 +++++++++++++++++++++++++++++
 docs/source/hyperparameters.rst | 31 -------------------------------
 docs/source/index.rst           |  1 +
 3 files changed, 30 insertions(+), 31 deletions(-)
 create mode 100644 docs/source/cloud_training.rst

diff --git a/docs/source/cloud_training.rst b/docs/source/cloud_training.rst
new file mode 100644
index 0000000000..9fef417da7
--- /dev/null
+++ b/docs/source/cloud_training.rst
@@ -0,0 +1,29 @@
+################
+AWS/GCP training
+################
+Lightning has a native solution for training on AWS/GCP at scale (Lightning-Grid).
+Grid is in private early access now, but you can request access at `grid.ai <https://www.grid.ai>`_.
+
+We've designed Grid to work for Lightning users without needing to make ANY changes to their code.
+
+To use Grid, take your regular command:
+
+.. code-block:: bash
+
+    python my_model.py --learning_rate 1e-6 --layers 2 --gpus 4
+
+and change it to use the ``grid train`` command:
+
+.. code-block:: bash
+
+    grid train --grid_gpus 4 my_model.py --learning_rate 'uniform(1e-6, 1e-1, 20)' --layers '[2, 4, 8, 16]'
+
+The above command launches 20 * 4 = 80 experiments, each running on 4 GPUs (320 GPUs!), with ZERO changes to
+your code.
+
+The ``uniform`` call is part of our new expressive syntax, which lets you construct hyperparameter combinations
+from 20+ distributions, lists, etc. Of course, you can also configure all of this with YAML files, which
+can be dynamically assembled at runtime.
+
+
+.. hint:: Grid supports the search strategy of your choice! (and much more than just sweeps)
\ No newline at end of file
diff --git a/docs/source/hyperparameters.rst b/docs/source/hyperparameters.rst
index e85a64e58d..4d1d7ee455 100644
--- a/docs/source/hyperparameters.rst
+++ b/docs/source/hyperparameters.rst
@@ -12,37 +12,6 @@ Hyperparameters
 Lightning has utilities to interact seamlessly with the command line ``ArgumentParser``
 and plays well with the hyperparameter optimization framework of your choice.
 
-----------
-
-Lightning-Grid
---------------
-Lightning has a native solution for doing sweeps and training models at scale called Lightning-Grid.
-Grid lets you launch sweeps from your laptop on the cloud provider of your choice. We've designed Grid to
-work for Lightning users without needing to make ANY changes to their code.
-
-To use grid, take your regular command:
-
-.. code-block:: bash
-
-    python my_model.py --learning_rate 1e-6 --layers 2 --gpus 4
-
-And change it to use the grid train command:
-
-.. code-block:: bash
-
-    grid train --grid_gpus 4 my_model.py --learning_rate 'uniform(1e-6, 1e-1, 20)' --layers '[2, 4, 8, 16]'
-
-The above command will launch (20 * 4) experiments each running on 4 GPUs (320 GPUs!) - by making ZERO changes to
-your code.
-
-The `uniform` command is part of our new expressive syntax which lets you construct hyperparameter combinations
-using over 20+ distributions, lists, etc. Of course, you can also configure all of this using yamls which
-can be dynamically assembled at runtime.
-
-Grid is in private early-access now but you can request access at `grid.ai <https://www.grid.ai>`_.
-
-.. hint:: Grid supports the search strategy of your choice! (and much more than just sweeps)
-
 ----------
 
 ArgumentParser
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 9f74884d4d..071f5ea04d 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -82,6 +82,7 @@ PyTorch Lightning Documentation
    :name: Common Use Cases
    :caption: Common Use Cases
 
+   cloud_training
    amp
    slurm
    child_modules
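As a quick illustration of the sweep arithmetic described in the docs above, here is a hypothetical command sketch: ``my_model.py``, ``--learning_rate``, and ``--batch_size`` are stand-in script arguments, and only the ``uniform(...)`` and list syntax already shown in the patch are assumed.

.. code-block:: bash

    # 8 sampled learning rates x 2 batch sizes = 16 experiments,
    # each running on 2 GPUs, i.e. 32 GPUs in total
    grid train --grid_gpus 2 my_model.py \
        --learning_rate 'uniform(1e-4, 1e-1, 8)' \
        --batch_size '[32, 64]'

The total experiment count is the product of the option counts: each value drawn from ``uniform`` is crossed with each entry of the list, and every resulting combination gets its own ``--grid_gpus`` GPUs.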