docs dpp warn (#1835)

* add warn

* Apply suggestions from code review
Jirka Borovec 2020-05-14 17:06:03 +02:00 committed by GitHub
parent 88f816ed06
commit 236c1378f9
2 changed files with 6 additions and 2 deletions


@@ -18,7 +18,9 @@ after each processed batch and the corresponding loss is logged. The result of
 this is a `lr` vs. `loss` plot that can be used as guidance for choosing an optimal
 initial lr.
-.. warning:: For the moment, this feature only works with models having a single optimizer.
+Warnings:
+- For the moment, this feature only works with models having a single optimizer.
+- LR support for DDP is not implemented yet, it is coming soon.
 Using Lightning's built-in LR finder
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
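The warning above lands in the LR finder page of the docs. For context, here is a minimal sketch of the workflow that page describes, assuming a pytorch_lightning release from around the time of this commit where `Trainer.lr_find` is available (later releases move it under the tuner); `LitModel` and its hyperparameters are made up for illustration.

.. code-block:: python

    # Sketch only: API names follow pytorch_lightning ~0.7.x; `LitModel` is a
    # hypothetical single-optimizer LightningModule, as the warning above requires.
    import torch
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self, lr=1e-3, batch_size=32):
            super().__init__()
            self.lr = lr
            self.batch_size = batch_size
            self.layer = torch.nn.Linear(32, 2)

        def forward(self, x):
            return self.layer(x)

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = torch.nn.functional.cross_entropy(self(x), y)
            return {'loss': loss}

        def train_dataloader(self):
            x = torch.randn(512, 32)
            y = torch.randint(0, 2, (512,))
            return DataLoader(TensorDataset(x, y), batch_size=self.batch_size)

        def configure_optimizers(self):
            # a single optimizer, per the restriction noted in the warning
            return torch.optim.Adam(self.parameters(), lr=self.lr)

    model = LitModel()
    trainer = pl.Trainer(max_epochs=1)

    # run the LR range test; returns an object holding the lr-vs-loss results
    lr_finder = trainer.lr_find(model)
    lr_finder.plot(suggest=True)        # the `lr` vs. `loss` plot mentioned above
    model.lr = lr_finder.suggestion()   # lr at the steepest descent of the loss
    trainer.fit(model)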


@@ -71,7 +71,7 @@ a binary search.
 .. warning::
-    Due to these contrains, this features does *NOT* work when passing dataloaders directly
+    Due to these constraints, this feature does *NOT* work when passing dataloaders directly
     to `.fit()`.
 The scaling algorithm has a number of parameters that the user can control by
@@ -107,3 +107,5 @@ The algorithm in short works by:
 .. autoclass:: pytorch_lightning.trainer.training_tricks.TrainerTrainingTricksMixin
     :members: scale_batch_size
     :noindex:
+
+.. warning:: Batch size finder is not supported for DDP yet, it is coming soon.
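The batch size finder documented by the `autoclass` directive above follows a similar pattern. A rough usage sketch, assuming the ~0.7.x-era `trainer.scale_batch_size(...)` method and reusing the hypothetical `LitModel` from the earlier sketch (whose dataloader reads `self.batch_size`); the parameter names are assumptions to check against the installed version, and, per the new warning, this runs single-process only, not under DDP.

.. code-block:: python

    # Sketch only: argument names follow the ~0.7.x API; the model must expose
    # the attribute (here `batch_size`) that its dataloader actually uses.
    model = LitModel(batch_size=2)
    trainer = pl.Trainer(max_epochs=1)

    # search for the largest batch size that still fits in memory, either by
    # repeated doubling ('power') or by bisecting between a working and a
    # failing size ('binsearch', the binary search mentioned in the docs)
    new_size = trainer.scale_batch_size(
        model,
        mode='binsearch',    # 'power' or 'binsearch'
        steps_per_trial=3,   # training steps executed per candidate batch size
        init_val=2,          # batch size the search starts from
        max_trials=25,       # upper bound on the number of candidates tried
    )

    model.batch_size = new_size
    trainer.fit(model)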