From ebbd1169d838b44e98cef5cc4d0f1f0860fc140f Mon Sep 17 00:00:00 2001
From: Jakub Kaczmarzyk
Date: Thu, 4 May 2023 07:35:05 -0400
Subject: [PATCH] add note in LightningCLI docs that `--optimizer` must be given for `--lr_scheduler` to work (#17552)

---
 docs/source-pytorch/cli/lightning_cli_intermediate_2.rst | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/docs/source-pytorch/cli/lightning_cli_intermediate_2.rst b/docs/source-pytorch/cli/lightning_cli_intermediate_2.rst
index 166f21ebd8..db06f85c35 100644
--- a/docs/source-pytorch/cli/lightning_cli_intermediate_2.rst
+++ b/docs/source-pytorch/cli/lightning_cli_intermediate_2.rst
@@ -193,13 +193,15 @@ Standard learning rate schedulers from ``torch.optim.lr_scheduler`` work out of
 
 .. code:: bash
 
-    python main.py fit --lr_scheduler CosineAnnealingLR
+    python main.py fit --optimizer=Adam --lr_scheduler CosineAnnealingLR
+
+Please note that ``--optimizer`` must be added for ``--lr_scheduler`` to have an effect.
 
 If the scheduler you want needs other arguments, add them via the CLI (no need to change your code)!
 
 .. code:: bash
 
-    python main.py fit --lr_scheduler=ReduceLROnPlateau --lr_scheduler.monitor=epoch
+    python main.py fit --optimizer=Adam --lr_scheduler=ReduceLROnPlateau --lr_scheduler.monitor=epoch
 
 Furthermore, any custom subclass of ``torch.optim.lr_scheduler.LRScheduler`` can be used as learning rate scheduler:
 
@@ -224,7 +226,7 @@ Now you can choose between any learning rate scheduler at runtime:
 .. code:: bash
 
     # LitLRScheduler
-    python main.py fit --lr_scheduler LitLRScheduler
+    python main.py fit --optimizer=Adam --lr_scheduler LitLRScheduler
 
 ----
 
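
Note for anyone trying the patched commands: they assume a ``main.py`` that instantiates ``LightningCLI``. The sketch below is illustrative only, not the docs page's verbatim example; ``BoringModel``/``BoringDataModule`` are Lightning's demo classes standing in for a real model and datamodule, and the ``LitLRScheduler`` body is a placeholder.

.. code:: python

    # main.py -- minimal sketch of the entry point the patched commands assume.
    import torch
    from lightning.pytorch.cli import LightningCLI
    from lightning.pytorch.demos.boring_classes import BoringDataModule, BoringModel


    class LitLRScheduler(torch.optim.lr_scheduler.CosineAnnealingLR):
        # Placeholder custom subclass, selectable at runtime with
        # ``--lr_scheduler LitLRScheduler``; its required ``T_max`` argument
        # is supplied as ``--lr_scheduler.T_max=...``.
        pass


    def cli_main():
        # When ``--optimizer`` is given, the CLI generates ``configure_optimizers``
        # for the model and attaches any ``--lr_scheduler`` to that optimizer,
        # which is why a scheduler alone has no effect, as the note being added
        # in this patch explains.
        LightningCLI(BoringModel, BoringDataModule)


    if __name__ == "__main__":
        cli_main()

With such a file the patched commands run as written, e.g. ``python main.py fit --optimizer=Adam --lr_scheduler=LitLRScheduler --lr_scheduler.T_max=100``.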