Add trainer flag step [ci skip] (#4147)

* Add trainer flag step

Add step to disable automatic optimization in the trainer
@justusschock

* Apply suggestions from code review

Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Paul committed 2020-10-14 19:14:40 +02:00 (via GitHub)
parent 2232d37225
commit 051d9b1d94
1 changed file with 3 additions and 3 deletions

@@ -20,9 +20,9 @@ Manual optimization
===================
For advanced research topics like reinforcement learning, sparse coding, or GAN research, it may be desirable
to manually manage the optimization process. To do so, do the following:
-* Ignore the optimizer_idx argument
-* So we can scale the loss automatically for you use self.manual_backward(loss) instead of loss.backward()
-* Disable automatic optimization in Trainer: Trainer(automatic_optimization=False)
+* Drop or ignore the optimizer_idx argument
+* Use `self.manual_backward(loss)` instead of `loss.backward()` to automatically scale your loss
.. code-block:: python
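
For context on what "manually manage the optimization process" means, here is a plain-Python sketch of a hand-written update loop minimizing f(w) = (w - 3)^2 by gradient descent. This is only an illustration of the idea, not Lightning's API; the function name `minimize` and the `lr`/`steps` values are arbitrary choices for the example.

```python
# Sketch: manual optimization means you compute gradients and apply
# the parameter update yourself, instead of a framework doing it.
def minimize(lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)   # df/dw for f(w) = (w - 3)^2
        w -= lr * grad       # the "manual" optimizer step
    return w

print(minimize())  # converges close to 3.0
```

In Lightning, the analogous pieces are `self.manual_backward(loss)` to compute (and scale) gradients and explicit calls to the optimizer, enabled by the `Trainer(automatic_optimization=False)` flag this commit documents.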