lightning/pytorch_lightning/core
Latest commit: c6e02e481e by David Chan (2021-06-17 12:50:37 +02:00)
[feat] Allow overriding optimizer_zero_grad and/or optimizer_step when using accumulate_grad_batches (#7980)
A usage sketch for this feature follows the file listing below.
__init__.py
datamodule.py Remove rank_zero_only on DataModule prepare_data (#7945) 2021-06-12 12:50:29 +02:00
decorators.py Move parameter validation specific to TPU Training plugins (#7415) 2021-05-24 16:02:01 +00:00
grads.py Move grad_norm to a dedicated utilities file (#7292) 2021-04-30 09:19:22 -07:00
hooks.py Standardize positional datamodule and argument names (#7431) 2021-06-15 11:50:13 +00:00
lightning.py [feat] Allow overriding optimizer_zero_grad and/or optimizer_step when using accumulate_grad_batches (#7980) 2021-06-17 12:50:37 +02:00
memory.py Handle errors due to uninitialized parameters (#7642) 2021-06-14 15:56:03 +00:00
optimizer.py Loop Refactor 1/N - Training Loop (#7871) 2021-06-15 12:55:06 +00:00
saving.py
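The headline change for this directory (#7980, touching lightning.py) lets a LightningModule override optimizer_zero_grad and/or optimizer_step while the Trainer is configured with accumulate_grad_batches. The following is a minimal sketch of how a user might combine the two; the hook signatures correspond to the 1.3-era API and may differ in other releases, and the model, dataset, and hyperparameters are illustrative assumptions rather than code from this repository.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl


    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = nn.Linear(32, 2)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return nn.functional.cross_entropy(self.layer(x), y)

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)

        # Custom zero_grad: set gradients to None instead of zeroing them.
        def optimizer_zero_grad(self, epoch, batch_idx, optimizer, optimizer_idx):
            optimizer.zero_grad(set_to_none=True)

        # Custom optimizer_step: delegate to the closure-based step.
        def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                           optimizer_closure, on_tpu=False,
                           using_native_amp=False, using_lbfgs=False):
            optimizer.step(closure=optimizer_closure)


    if __name__ == "__main__":
        data = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
        trainer = pl.Trainer(max_epochs=1, accumulate_grad_batches=4)
        trainer.fit(LitModel(), DataLoader(data, batch_size=8))

With accumulate_grad_batches=4 the Trainer is expected to invoke these overridden hooks only at accumulation boundaries; the exact call pattern is version-dependent.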