From d7f567126b5bfb8a61a08282a83ddf5d9a4141f4 Mon Sep 17 00:00:00 2001
From: Nrupatunga
Date: Wed, 14 Oct 2020 23:13:58 +0530
Subject: [PATCH] [Doc] Lightning Module (#4123)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* make current_epoch and global_step to be same as trainer, after model restore.

* remove assignment here

* test

* minor modification

* merge with parent's master

* doc fix / improve

* doc fix!

Co-authored-by: Adrian Wälchli
---
 docs/source/new-project.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/new-project.rst b/docs/source/new-project.rst
index b2e527eb01..1f051ca183 100644
--- a/docs/source/new-project.rst
+++ b/docs/source/new-project.rst
@@ -120,7 +120,7 @@ Step 1: Define LightningModule
         optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
         return optimizer
 
-**SYTEM VS MODEL**
+**SYSTEM VS MODEL**
 
 A :ref:`lightning_module` defines a *system* not a model.
 
@@ -143,7 +143,7 @@ Under the hood a LightningModule is still just a :class:`torch.nn.Module` that g
 - The Train loop
 - The Validation loop
 - The Test loop
-- The Model + system architecture
+- The Model or system of Models
 - The Optimizer
 
 You can customize any part of training (such as the backward pass) by overriding any