From d4b1ac94a0294315ba949e3e2de4f472249f3fdc Mon Sep 17 00:00:00 2001
From: William Falcon
Date: Tue, 13 Aug 2019 13:03:39 -0400
Subject: [PATCH] updated docs

---
 docs/LightningModule/RequiredTrainerInterface.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/docs/LightningModule/RequiredTrainerInterface.md b/docs/LightningModule/RequiredTrainerInterface.md
index f1f862e802..f532e29e0e 100644
--- a/docs/LightningModule/RequiredTrainerInterface.md
+++ b/docs/LightningModule/RequiredTrainerInterface.md
@@ -185,6 +185,9 @@ def configure_optimizers(self)
 
 Set up as many optimizers and (optionally) learning rate schedulers as you need. Normally you'd need one. But in the case of GANs or something more esoteric you might have multiple.
 Lightning will call .backward() and .step() on each one in every epoch. If you use 16 bit precision it will also handle that.
 
+**Note:** If you use multiple optimizers, training_step will have an additional ```optimizer_idx``` parameter.
+
+
 ##### Return
 List or Tuple - List of optimizers with an optional second list of learning-rate schedulers
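
For context, a minimal sketch of the behavior the added note documents: when `configure_optimizers` returns more than one optimizer, Lightning passes an extra `optimizer_idx` argument to `training_step`. The `ToyGAN` class, its stand-in `nn.Linear` networks, and the placeholder losses below are illustrative assumptions rather than code from the Lightning docs, and the parameter names (`data_batch`, `batch_nb`) follow the early Lightning signature of this era; exact names varied across versions.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl


class ToyGAN(pl.LightningModule):
    """Toy two-optimizer module; networks and losses are placeholders."""

    def __init__(self):
        super().__init__()
        self.generator = nn.Linear(10, 10)      # stand-in generator
        self.discriminator = nn.Linear(10, 1)   # stand-in discriminator

    def configure_optimizers(self):
        # Two optimizers: Lightning will call .backward() and .step() on each.
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=2e-4)
        return [opt_g, opt_d]

    def training_step(self, data_batch, batch_nb, optimizer_idx):
        # Because configure_optimizers returned two optimizers, Lightning
        # passes optimizer_idx so the step can branch on which optimizer
        # is currently active.
        x, _ = data_batch
        fake = self.generator(torch.randn_like(x))
        if optimizer_idx == 0:   # generator update
            loss = -self.discriminator(fake).mean()
        else:                    # discriminator update
            loss = self.discriminator(fake.detach()).mean() - self.discriminator(x).mean()
        return {'loss': loss}
```

The index corresponds to each optimizer's position in the list returned by `configure_optimizers`, so in this sketch `optimizer_idx == 0` selects the generator's optimizer and `optimizer_idx == 1` the discriminator's.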