lightning/pytorch_lightning/accelerators
chaton f2e99d617f
deprecate enable_pl_optimizer as it is not restored properly (#5244)
* update

* clean test

* still in progress

* update test

* update

* update

* resolve flake

* add test for zero_grad

* update

* works without accumulated_grad

* update

* update

* resolve amp

* revert back to True

* update

* clean tests

* cleaned out

* typo

* update test

* git repair bug

* remove print

* update

* Fix formatting/optimizer imports

* Refactor the test for cleanliness

* Add vanilla model to the test, better var names

* Fixed var names, let's clean up these mock tests

* repair test

* update test

* resolve flake8

* add manual_optimization

* update tests

* resolve flake8

* add random accumulate_grad_batches

* improve test

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* update

* clean tests

* correct bug

* Apply suggestions from code review

* format

* address comments

* update on comments

* wip

* typo

* deprecate enable_pl_optimizer

* resolve latest bugs

* update

* resolve merge

* add comment

* Update pytorch_lightning/core/lightning.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update tests/deprecated_api/test_remove_1-3.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update pytorch_lightning/trainer/connectors/optimizer_connector.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update pytorch_lightning/trainer/trainer.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update pytorch_lightning/trainer/trainer.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>

* update on comments

* update restore

* add a property

* remove setstate as not needed anymore

* update test

* provide optimizer to on_before_zero_grad

* update on comments

* update on comments

* Update pytorch_lightning/trainer/trainer.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>

* Update tests/trainer/optimization/test_parity_automatic_optimization.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>

* modify import

* update changelog

* resolve flake8

* update

* update

* clean doc

Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Ubuntu <ubuntu@ip-172-31-62-109.ec2.internal>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2021-01-08 16:13:12 -05:00
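The commit above deprecates the `enable_pl_optimizer` Trainer flag because its state was not restored properly from checkpoints. As an illustration only (not the actual Lightning source), a minimal sketch of how such a constructor-flag deprecation is typically handled — this `Trainer` stub and its warning text are hypothetical:

```python
import warnings


class Trainer:
    """Hypothetical stand-in showing one common way to deprecate a flag."""

    def __init__(self, enable_pl_optimizer=None):
        if enable_pl_optimizer is not None:
            # Emit a DeprecationWarning instead of honoring the flag,
            # since its value cannot be restored reliably (the stated
            # motivation for this change).
            warnings.warn(
                "`enable_pl_optimizer` is deprecated and will be removed; "
                "optimizer wrapping is now handled automatically.",
                DeprecationWarning,
            )
```

Callers who still pass the flag then see a warning rather than a silent behavior change, giving them a release cycle to migrate.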
..
__init__.py ref: unify slurm and TE under backendPlugin 3/n (#4581) 2020-11-08 15:32:37 -05:00
accelerator.py Initialize trainer with None in DDPAccelerator (#4915) 2020-12-10 15:24:44 +01:00
accelerator_connector.py ref: clean config [1/n] add intermediate setters (#4990) 2020-12-09 14:13:57 -05:00
cpu_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
ddp2_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
ddp_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
ddp_cpu_hpc_accelerator.py Fix hang in DDP HPC accelerators (#5157) 2020-12-16 21:07:11 +01:00
ddp_cpu_spawn_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
ddp_hpc_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
ddp_spawn_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
dp_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
gpu_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
horovod_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00
tpu_accelerator.py deprecate enable_pl_optimizer as it is not restored properly (#5244) 2021-01-08 16:13:12 -05:00