[Doc] Lightning Module (#4123)
* make current_epoch and global_step match the trainer's values after model restore
* remove assignment here
* test
* minor modification
* merge with parent's master
* doc fix / improve
* doc fix!

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
parent 051d9b1d94
commit d7f567126b
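The first bullet of the commit message says the trainer's `current_epoch` and `global_step` should come from the checkpoint after a model restore, rather than being re-assigned elsewhere. A minimal framework-free sketch of that idea (the `restore_training_state` helper and dict-based state are assumptions for illustration, not Lightning API; the real logic lives inside the Trainer's checkpoint restore path):

```python
def restore_training_state(trainer_state, checkpoint):
    """Sketch: after restoring a model from a checkpoint, the trainer's
    progress counters are taken from the checkpoint so they stay in sync.

    trainer_state: dict with 'current_epoch' and 'global_step' counters
    checkpoint:    dict as saved at checkpoint time ('epoch', 'global_step')
    """
    trainer_state["current_epoch"] = checkpoint["epoch"]
    trainer_state["global_step"] = checkpoint["global_step"]
    return trainer_state


# Usage: a fresh trainer state picks up the checkpoint's counters.
state = restore_training_state(
    {"current_epoch": 0, "global_step": 0},
    {"epoch": 3, "global_step": 1200},
)
```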
@@ -120,7 +120,7 @@ Step 1: Define LightningModule
         optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
         return optimizer
 
-**SYTEM VS MODEL**
+**SYSTEM VS MODEL**
 
 A :ref:`lightning_module` defines a *system* not a model.
 
@@ -143,7 +143,7 @@ Under the hood a LightningModule is still just a :class:`torch.nn.Module` that g
 - The Train loop
 - The Validation loop
 - The Test loop
-- The Model + system architecture
+- The Model or system of Models
 - The Optimizer
 
 You can customize any part of training (such as the backward pass) by overriding any
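The five parts listed in the second hunk correspond to hooks a user overrides on the module. A framework-free sketch of that layout (plain Python with a stand-in base class; the method names follow Lightning's conventions, but nothing here imports `torch` or `pytorch_lightning` and the toy loss is an assumption for illustration):

```python
class LightningModuleSketch:
    """Stand-in for a LightningModule base class: lists the overridable
    hooks that correspond to the five parts a system organizes."""

    def training_step(self, batch, batch_idx):      # the Train loop
        raise NotImplementedError

    def validation_step(self, batch, batch_idx):    # the Validation loop
        raise NotImplementedError

    def test_step(self, batch, batch_idx):          # the Test loop
        raise NotImplementedError

    def configure_optimizers(self):                 # the Optimizer
        raise NotImplementedError


class ToySystem(LightningModuleSketch):
    """A 'system', not a model: the model(s), loops, and optimizer
    choice all live together in one self-contained class."""

    def training_step(self, batch, batch_idx):
        x, y = batch
        return (x - y) ** 2        # toy squared-error loss, no real model

    def configure_optimizers(self):
        return "Adam(lr=1e-3)"     # placeholder for torch.optim.Adam
```

This mirrors why the hunk changes "The Model + system architecture" to "The Model or system of Models": one module may wrap several models (e.g. a GAN's generator and discriminator), and the class above is the unit that groups them.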