lightning/pytorch_lightning/callbacks
chaton 02152c1729
Simplify optimization logic (#4984)
* Rely on the DDP plugin for blocking sync behaviour, and skip it when using manual optimization
* debug
* Revert "debug" (reverts commit ccca6b6b)
* Expose manual reduce for automatic optimization
* Add input arguments
* Enable parity test
* Clean imports
* Expose after-hook to ensure we reset
* Fix naming
* add
* Fix test
* Unify optimizer logic
* Resolve test
* Resolve flake8
* Resolve AMP bug
* Update tests
* Remove bug
* Remove optimizer_step from accelerators
* Fix typo
* Update LightningOptimizer
* set doesn't work with ddp_spawn
* Resolve flake8
* Update threshold
* Ignore pyright
* Correct CodeFactor issues
* Remove useless if
* Remove zero_grad function
* Simplify step
* Remove typo
* Resolve bug
* Apply suggestions from code review
* Update on comments
* Resolve bugs
* Remove tests
* Update pytorch_lightning/trainer/configuration_validator.py
  Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* Simplify testing
* Add more tests

Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-12-07 12:55:49 +00:00
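For context on the "manual optimization" the commit message refers to: a LightningModule can opt out of Lightning's automatic loop and drive the optimizer itself, in which case the blocking DDP sync mentioned above is skipped. A minimal sketch, assuming the long-standing public API (details differed around the 1.1 release this commit landed in, e.g. `manual_backward` once took the optimizer as an argument); illustrative only, not code from PR #4984:

```python
import torch
import pytorch_lightning as pl


class ManualOptModel(pl.LightningModule):
    """Hypothetical module illustrating manual optimization."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # Opt out of the automatic optimization loop.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()  # LightningOptimizer wrapper around the SGD below
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.layer(x), y)
        opt.zero_grad()
        self.manual_backward(loss)  # routes through Lightning's precision/AMP handling
        opt.step()
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```

Under automatic optimization (the default), Lightning calls backward(), optimizer.step() and zero_grad() itself; that is the code path this PR consolidates.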
__init__.py
base.py
early_stopping.py Deprecate auto mode from ModelCheckpoint and EarlyStopping (#4695) 2020-12-04 16:11:58 +01:00
gpu_stats_monitor.py
gradient_accumulation_scheduler.py Simplify optimization logic (#4984) 2020-12-07 12:55:49 +00:00
lr_monitor.py Update lr_monitor.py (#4826) 2020-11-24 02:07:33 -05:00
model_checkpoint.py Added changeable extension variable for model checkpoints (#4977) 2020-12-06 22:58:50 +05:30
progress.py
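On the early_stopping.py entry: PR #4695 deprecated mode='auto' in ModelCheckpoint and EarlyStopping, so the direction of the monitored metric should be stated explicitly. A hedged usage sketch (standard callback arguments, not code from that PR):

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping, ModelCheckpoint

# With mode='auto' deprecated, name the metric direction explicitly:
# 'min' for quantities that should decrease (e.g. val_loss),
# 'max' for quantities that should increase (e.g. val_acc).
early_stop = EarlyStopping(monitor="val_loss", mode="min", patience=3)
checkpoint = ModelCheckpoint(monitor="val_loss", mode="min", save_top_k=1)

trainer = pl.Trainer(callbacks=[early_stop, checkpoint])
```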