added module properties docs
This commit is contained in:
parent e00d097c12
commit 0bdb8533c6

@@ -0,0 +1,25 @@
+A LightningModule has the following properties which you can access at any time:
+
+---
+#### current_epoch
+The current epoch.
+
+---
+#### dtype
+The current dtype.
+
+---
+#### global_step
+Total training batches seen across all epochs.
+
+---
+#### gradient_clip
+The current gradient clip value.
+
+---
+#### on_gpu
+True if your model is currently running on GPUs. Useful for setting flags in the LightningModule for different CPU vs GPU behavior.
+
+---
+#### trainer
+Last-resort access to any state the trainer has. Changing certain properties here could affect your training run.
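Taken together, these properties let you branch on runtime state from inside your module. A minimal sketch of the access pattern, using a plain-Python stand-in class (the attribute names mirror the docs above, but `ModuleStateDemo` and `pick_device` are hypothetical illustrations, not part of pytorch-lightning):

```python
class ModuleStateDemo:
    """Plain-Python stand-in for the properties a LightningModule exposes."""

    def __init__(self):
        self.current_epoch = 0    # the current epoch
        self.global_step = 0      # total training batches seen across all epochs
        self.gradient_clip = 0.5  # the current gradient clip value
        self.on_gpu = False       # True when the model is running on GPUs
        self.trainer = None       # last-resort access to trainer state

    def pick_device(self):
        # Branch on CPU vs GPU behavior, as the on_gpu docs suggest
        return "cuda" if self.on_gpu else "cpu"


demo = ModuleStateDemo()
print(demo.pick_device())  # -> cpu (on_gpu defaults to False)
```

In a real LightningModule the trainer keeps these values up to date; here they are plain attributes purely to show the lookup pattern.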
@@ -11,9 +11,9 @@
 - [GPU cluster Trainer](https://github.com/williamFalcon/pytorch-lightning/blob/master/examples/new_project_templates/trainer_gpu_cluster_template.py)

 ###### Quick start examples
-- [CPU example](https://williamfalcon.github.io/pytorch-lightning/Examples/#CPU-hyperparameter-search)
-- [Hyperparameter search on single GPU](https://williamfalcon.github.io/pytorch-lightning/Examples/#Hyperparameter-search-on-a-single-or-multiple-GPUs)
-- [Hyperparameter search on multiple GPUs on same node](https://williamfalcon.github.io/pytorch-lightning/Examples/#Hyperparameter-search-on-a-single-or-multiple-GPUs)
+- [CPU example](https://williamfalcon.github.io/pytorch-lightning/Examples/#cpu-hyperparameter-search)
+- [Hyperparameter search on single GPU](https://williamfalcon.github.io/pytorch-lightning/Examples/#hyperparameter-search-on-a-single-or-multiple-gpus)
+- [Hyperparameter search on multiple GPUs on same node](https://williamfalcon.github.io/pytorch-lightning/Examples/#hyperparameter-search-on-a-single-or-multiple-gpus)
 - [Hyperparameter search on a SLURM HPC cluster](https://williamfalcon.github.io/pytorch-lightning/Examples/#Hyperparameter search on a SLURM HPC cluster)
@@ -24,7 +24,6 @@ class LightningModule(GradInformation, ModelIO, OptimizerConfig, ModelHooks):
         self.fast_dev_run = hparams.fast_dev_run
         self.overfit = hparams.overfit
         self.gradient_clip = hparams.gradient_clip
-        self.num = 2
         self.trainer = None
         self.from_lightning = True
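The constructor in this hunk copies flags off an `hparams` object, so any namespace-like object with matching attribute names works. A sketch using `argparse.Namespace` (the attribute names mirror the diff; `HparamsDemo` is a hypothetical stand-in, not the real LightningModule):

```python
from argparse import Namespace


class HparamsDemo:
    """Hypothetical stand-in mirroring the __init__ assignments in the diff."""

    def __init__(self, hparams):
        # Copy run-configuration flags from the hparams namespace
        self.fast_dev_run = hparams.fast_dev_run
        self.overfit = hparams.overfit
        self.gradient_clip = hparams.gradient_clip
        # Trainer is attached later; not set from hparams
        self.trainer = None
        self.from_lightning = True


hparams = Namespace(fast_dev_run=False, overfit=False, gradient_clip=0.5)
model = HparamsDemo(hparams)
print(model.gradient_clip)  # -> 0.5
```

Storing hparams attributes individually like this keeps the module decoupled from how the flags were parsed (argparse, a test fixture, or a config loader all work).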