[Doc] Lightning Module (#4123)

* make current_epoch and global_step the same as the trainer's after model restore.

* remove assignment here

* test

* minor modification

* merge with parent's master

* doc fix / improve

* doc fix!

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Nrupatunga 2020-10-14 23:13:58 +05:30 committed by GitHub
parent 051d9b1d94
commit d7f567126b
1 changed file with 2 additions and 2 deletions


@@ -120,7 +120,7 @@ Step 1: Define LightningModule
         optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
         return optimizer
-**SYTEM VS MODEL**
+**SYSTEM VS MODEL**
 A :ref:`lightning_module` defines a *system* not a model.
@@ -143,7 +143,7 @@ Under the hood a LightningModule is still just a :class:`torch.nn.Module` that g
 - The Train loop
 - The Validation loop
 - The Test loop
--  The Model + system architecture
+-  The Model or system of Models
 - The Optimizer
 You can customize any part of training (such as the backward pass) by overriding any
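The "system, not a model" distinction in the diff above can be sketched as follows. This is a hypothetical illustration, not the project's actual docs example: it composes two sub-models (an encoder and a decoder, names chosen here for illustration) plus a `training_step` and `configure_optimizers`, but subclasses plain `torch.nn.Module` so it runs without `pytorch_lightning` installed.

```python
import torch
from torch import nn

class AutoEncoderSystem(nn.Module):
    """A *system* grouping two models, a train-step, and an optimizer."""

    def __init__(self):
        super().__init__()
        # Two separate models composed into one system.
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU())
        self.decoder = nn.Linear(64, 28 * 28)

    def forward(self, x):
        return self.decoder(self.encoder(x))

    def training_step(self, batch):
        # One step of the train loop: reconstruct the flattened input.
        x = batch.view(batch.size(0), -1)
        x_hat = self(x)
        return nn.functional.mse_loss(x_hat, x)

    def configure_optimizers(self):
        # Mirrors the optimizer snippet shown in the diff context above.
        return torch.optim.Adam(self.parameters(), lr=1e-3)

system = AutoEncoderSystem()
loss = system.training_step(torch.rand(4, 28, 28))
```

A real LightningModule would instead subclass `pytorch_lightning.LightningModule`, and the Trainer would call `training_step` and `configure_optimizers` for you.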