lightning/pytorch_lightning
Carlos Mocholí 03f01fb5ec
Fix gradient norm tracking and gradient clipping (#9287)
* WIP

* Progress

* Undo test change

* Fix plugin closure execution order

* Update CHANGELOG

* Fix manual optimization on AMP and skipping backward

* Fix for deepspeed

* Typo

* Hook test for manual closure

* Add skipping test with AMP

* You are hideous, apex

* Add deepspeed test

* Update CHANGELOG

* Fix for broken master

* Add RunIf

* FIXMEs

* Rename

* Fix grad norm

* add a simple test

* update test

* update test

* update test

* fix merge conflicts

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Sea of changes

* Undo change

* Introduce TPUPrecisionPlugin

* Undo changes

* Undo changes

* Resolve FIXME

* Undo change

* Undo change

* Undo change

* Fix FIXMEs

* Fix FIXME

* Correct value

* Bad merge

* Fix circular imports

* WIP

* Fixing clipping

* Fixes

* Bad merge

* Move optimizer step and clipping into the `PrecisionPlugin`

* Fix AMP

* Update CHANGELOG

* Fix tests

* Underscore

* Progress

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Remove pre_optimizer_step

* Missed one

* Progress

* Progress

* Fix test

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Update FIXMEs

* Fix test

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Fix test

* DeepSpeed warning. mypy

* Rename

* Finish tests

* Update CHANGELOG

* Dumb fixes

* accelerator=auto

* Apply suggestions from code review

Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>

* Update on comments

* Use ClassifModule

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2021-10-28 15:23:27 +00:00
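The commit above, per its title and the bullet "Move optimizer step and clipping into the `PrecisionPlugin`", concerns how the gradient norm is tracked and how gradients are clipped around the optimizer step. For orientation only, here is a minimal plain-PyTorch sketch of those two operations in a manual training step; it illustrates the underlying technique and is not Lightning's `PrecisionPlugin` implementation:

```python
# Illustrative sketch only: gradient norm tracking and norm-based clipping
# done by hand in plain PyTorch. In Lightning, this logic was moved into
# the PrecisionPlugin (see the commit message above).
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x, y = torch.randn(8, 4), torch.randn(8, 2)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Gradient norm tracking: the 2-norm over all parameter gradients, i.e. the
# quantity a `track_grad_norm`-style feature would log after backward.
grad_norm = torch.norm(
    torch.stack([p.grad.detach().norm(2) for p in model.parameters() if p.grad is not None]),
    2,
)
print(f"grad_2.0_norm_total: {grad_norm.item():.4f}")

# Gradient clipping by norm, applied after backward and before the step.
# clip_grad_norm_ rescales gradients in place so their total norm <= max_norm.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
optimizer.zero_grad()
```

The ordering is the point of the fix: the norm must be computed and the clipping applied after the backward pass but before `optimizer.step()`, which is why both now live next to the optimizer step inside the precision plugin.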
accelerators Fix gradient norm tracking and gradient clipping (#9287) 2021-10-28 15:23:27 +00:00
callbacks Deprecate `lr_sch_names` from `LearningRateMonitor` (#10066) 2021-10-28 12:57:04 +00:00
core Replace `_TORCH_GREATER_EQUAL_DEV_1_10` with `_TORCH_GREATER_EQUAL_1_10` (#10157) 2021-10-27 13:38:39 +01:00
distributed Deprecate LightningDistributed and keep logic in ddp/ddpSpawn directly (#9691) 2021-09-25 08:39:15 -07:00
loggers Group all the logged gradients under the same sub-folder (#7756) 2021-10-20 15:48:36 +00:00
loops Fix gradient norm tracking and gradient clipping (#9287) 2021-10-28 15:23:27 +00:00
overrides Update strategy flag in docs (#10000) 2021-10-20 21:02:53 +05:30
plugins Fix gradient norm tracking and gradient clipping (#9287) 2021-10-28 15:23:27 +00:00
profiler Make sure file and folder exists in Profiler (#10073) 2021-10-26 11:13:31 +00:00
trainer Pass the scaler as an input to `NativeMixedPrecisionPlugin` (#10055) 2021-10-28 14:13:53 +00:00
tuner Mark `trainer.data_connector` as protected (#10031) 2021-10-25 12:29:09 +01:00
utilities Fix `reset_seed()` converting the `PL_SEED_WORKERS` environment variable `str` read to `bool` (#10099) 2021-10-28 12:57:41 +00:00
__about__.py Prepare v1.5.0rc1 (#10068) 2021-10-22 11:00:35 -04:00
__init__.py CI: yesqa (#8564) 2021-08-02 16:05:56 +00:00
py.typed
setup_tools.py CI: precommit - docformatter (#8584) 2021-09-06 12:49:09 +00:00