lightning/pytorch_lightning
chaton c2e6e68c7e
optimizer clean up (#4658)
* add LightningOptimizer

* typo

* add mock closure

* typo

* remove logic in optimizer_step

* update

* update

* update

* deactivate LightningOptimizer for horovod

* resolve flake

* typo

* check optimizer name

* change name

* added backward to LightningOptimizer

* remove use_lightning_optimizer

* move update

* simplify init

* resolve comments

* resolve bug

* update

* update

* resolve bugs

* resolve flake8

* set state

* work manual_optimizer_step

* add doc

* add enable_pl_optimizer

* make optimizer_step

* add make_optimizer_step

* add examples

* resolve test

* add test_optimizer_return_options_enable_pl_optimizer

* add enable_pl_optimizer=True

* update

* update tests

* resolve bugs

* update

* set Trainer to False

* update

* resolve bugs

* update

* remove from doc

* resolve bug

* typo

* update

* set to True

* simplification

* typo

* resolve horovod

* unwrap horovod

* remove Optimizer

* resolve horovod

* move logic to amp_backend

* doesn't seem to be picklable

* update

* add again

* resolve some bugs

* cleanup

* resolve bug with AMP

* change __repr__

* round at -12

* update

* update

* update

* remove from horovod

* typo

* add convert_to_lightning_optimizers in each accelerators

* typo

* forgot

* forgot a convert_to_lightning_optimizers

* update

* update

* update

* increase coverage

* update

* resolve flake8

* update

* remove useless code

* resolve comments + add support for LightningOptimizer base class

* resolve flake

* check optimizer get wrapped back

* resolve DDPSharded

* reduce code

* lightningoptimizer

* Update pytorch_lightning/core/optimizer.py

Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>

* Update pytorch_lightning/core/lightning.py

* remove reference to step function

* Apply suggestions from code review

* update on comments

* resolve

* Update CHANGELOG.md

* add back training_step in apex and native_amp

* rename optimizer_step

Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-12-01 00:09:46 +00:00
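The commits above sketch a recurring pattern: wrap each user optimizer in a LightningOptimizer-style proxy that delegates attribute access to the wrapped optimizer and injects a (possibly mock) closure into `step()`. Below is a minimal, self-contained illustration of that proxy pattern only — it is not Lightning's actual implementation, and `FakeOptimizer`/`WrappedOptimizer` are hypothetical stand-ins (a real wrapper would hold a `torch.optim.Optimizer` and route through the trainer's precision/accelerator logic).

```python
class FakeOptimizer:
    """Hypothetical stand-in for a torch.optim.Optimizer."""

    def __init__(self):
        self.steps = 0

    def step(self, closure=None):
        # torch-style optimizers may call the closure to recompute the loss
        if closure is not None:
            closure()
        self.steps += 1


class WrappedOptimizer:
    """Minimal proxy in the spirit of the LightningOptimizer described above."""

    def __init__(self, optimizer):
        self._optimizer = optimizer

    def __getattr__(self, name):
        # fall through to the wrapped optimizer for anything not overridden,
        # so existing code keeps working on the wrapped object
        return getattr(self._optimizer, name)

    def step(self, closure=None):
        # inject a no-op "mock closure" when none is given, mirroring the
        # "add mock closure" commit above
        if closure is None:
            closure = lambda: None
        self._optimizer.step(closure=closure)

    def __repr__(self):
        return f"WrappedOptimizer(optimizer={type(self._optimizer).__name__})"


opt = WrappedOptimizer(FakeOptimizer())
opt.step()               # closure injected automatically
print(opt.steps)         # attribute resolved on the wrapped optimizer
print(repr(opt))
```

The `__getattr__` fall-through is what lets the wrapper be swapped in transparently: code that reads `optimizer.param_groups` or other attributes still reaches the underlying optimizer, while `step()` is intercepted.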
accelerators          optimizer clean up (#4658)                                2020-12-01 00:09:46 +00:00
callbacks             simplify imports xla / TPU (#4872)                        2020-11-27 00:37:48 +01:00
cluster_environments  ref: unify slurm and TE under backendPlugin 1/n (#4578)   2020-11-08 14:28:55 -05:00
core                  optimizer clean up (#4658)                                2020-12-01 00:09:46 +00:00
distributed           notices (#4118)                                           2020-10-13 07:18:07 -04:00
loggers               simplify imports Omegaconf (#4873)                        2020-11-27 01:00:56 +01:00
metrics               Add formulas and references to metrics docs (#4823)       2020-11-25 09:05:30 +01:00
overrides             Remove else check                                         2020-11-26 18:42:06 +00:00
plugins               optimizer clean up (#4658)                                2020-12-01 00:09:46 +00:00
profiler              update                                                    2020-11-27 17:48:51 +00:00
trainer               optimizer clean up (#4658)                                2020-12-01 00:09:46 +00:00
tuner                 Fix batch_arg_name bug (#4812)                            2020-11-23 11:34:11 +05:30
utilities             Merge branch 'master' into feature/plug                   2020-11-27 10:00:05 +00:00
__init__.py           Change version in __init__.py 1.0.4 -> 1.1-dev (#4760)    2020-11-19 07:56:30 +06:30
py.typed              make PyTorch Lightning PEP 561 Compliant (#3187)          2020-09-09 13:37:03 -04:00