From 0d5f7671e04d4a52f45c34a02c236a90d91f840e Mon Sep 17 00:00:00 2001
From: William Falcon
Date: Fri, 6 Mar 2020 14:53:27 -0500
Subject: [PATCH] updated docs

---
 docs/source/hyperparameters.rst    | 15 +++++++++++++++
 docs/source/introduction_guide.rst | 21 +++++++++++++++++----
 2 files changed, 32 insertions(+), 4 deletions(-)

diff --git a/docs/source/hyperparameters.rst b/docs/source/hyperparameters.rst
index ef0c9ce3ae..ea58b9e6eb 100644
--- a/docs/source/hyperparameters.rst
+++ b/docs/source/hyperparameters.rst
@@ -55,6 +55,21 @@ Now we can parametrize the LightningModule.
 
 .. note:: Bonus! if (hparams) is in your module, Lightning will save it into the checkpoint and restore your model using those hparams exactly.
 
+And we can also add all the flags available in the Trainer to the ArgumentParser.
+
+.. code-block:: python
+
+    # add all the available Trainer options to the ArgParser
+    parser = pl.Trainer.add_argparse_args(parser)
+    args = parser.parse_args()
+
+And now you can start your program with:
+
+.. code-block:: bash
+
+    # now you can use any trainer flag
+    $ python main.py --num_nodes 2 --gpus 8
+
 Trainer args
 ^^^^^^^^^^^^
 
diff --git a/docs/source/introduction_guide.rst b/docs/source/introduction_guide.rst
index 3fcf718955..d78f5381a7 100644
--- a/docs/source/introduction_guide.rst
+++ b/docs/source/introduction_guide.rst
@@ -578,7 +578,7 @@ Notice the epoch is MUCH faster!
 Hyperparameters
 ---------------
 Normally, we don't hard-code the values to a model. We usually use the command line to
-modify the network. The `Trainer` can add all the available options to an ArgumentParser.
+modify the network.
 
 .. code-block:: python
 
@@ -591,9 +591,6 @@ modify the network. The `Trainer` can add all the available options to an Argume
         parser.add_argument('--layer_2_dim', type=int, default=256)
         parser.add_argument('--batch_size', type=int, default=64)
 
-        # add all the available options to the trainer
-        parser = pl.Trainer.add_argparse_args(parser)
-        args = parser.parse_args()
 
 Now we can parametrize the LightningModule.
 
@@ -626,6 +623,22 @@ Now we can parametrize the LightningModule.
 
 .. note:: Bonus! if (hparams) is in your module, Lightning will save it into the checkpoint and restore your model using those hparams exactly.
 
+And we can also add all the flags available in the Trainer to the ArgumentParser.
+
+.. code-block:: python
+
+    # add all the available Trainer options to the ArgParser
+    parser = pl.Trainer.add_argparse_args(parser)
+    args = parser.parse_args()
+
+And now you can start your program with:
+
+.. code-block:: bash
+
+    # now you can use any trainer flag
+    $ python main.py --num_nodes 2 --gpus 8
+
+
 For a full guide on using hyperparameters, `check out the hyperparameters docs `_.
 
 ---------
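The pattern this patch documents — model flags plus Trainer flags on one `ArgumentParser` — can be sketched with the standard library alone. `add_trainer_args` below is a hypothetical stand-in for `pl.Trainer.add_argparse_args`, which in Lightning registers every `Trainer` constructor option as a flag; here only the two flags from the docs' example command line are added.

```python
from argparse import ArgumentParser

def add_trainer_args(parser: ArgumentParser) -> ArgumentParser:
    # Hypothetical stand-in for pl.Trainer.add_argparse_args(parser):
    # the real call registers every available Trainer option; this sketch
    # adds only the two flags used in the docs' example command line.
    parser.add_argument('--num_nodes', type=int, default=1)
    parser.add_argument('--gpus', type=int, default=0)
    return parser

# model-specific flags, as in the guide
parser = ArgumentParser()
parser.add_argument('--layer_1_dim', type=int, default=128)
parser.add_argument('--batch_size', type=int, default=64)

# append the (stand-in) Trainer flags to the same parser
parser = add_trainer_args(parser)

# simulates: python main.py --num_nodes 2 --gpus 8
args = parser.parse_args(['--num_nodes', '2', '--gpus', '8'])
print(args.num_nodes, args.gpus, args.layer_1_dim)  # 2 8 128
```

Because both sets of flags live on one parser, the resulting namespace can be passed wholesale to the model (as `hparams`) while the Trainer picks out its own options.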