chaton | 1a970b2d8d | [hotfix] Extend Optimizer + update doc (#5095)
* resolve urgent bug
* update pr
* update doc
* update
* remove typo
* add defaults
* Update pytorch_lightning/__init__.py
* Update setup.py
* update doc
* Update docs/source/optimizers.rst
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* update
* resolve doc
* debug test
* update test
* Update docs/source/optimizers.rst
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Update docs/source/optimizers.rst
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Update docs/source/optimizers.rst
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* remove useless import
* Update docs/source/optimizers.rst
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2020-12-11 14:24:59 -05:00
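The entry above extends the optimizer wrapper and its page in docs/source/optimizers.rst. As a hedged illustration of the closure-driven `step()` pattern that documentation revolves around, here is a minimal manual-optimization sketch; the module, layer, and loss are hypothetical, and the exact wrapper API of #5095 may differ:

```python
import torch
import pytorch_lightning as pl

class ClosureStepModel(pl.LightningModule):
    """Hypothetical module; assumes manual optimization is enabled."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()

        def closure():
            # Re-runs forward/backward inside step(), which optimizers
            # such as LBFGS require to re-evaluate the loss.
            loss = self.layer(batch).mean()
            opt.zero_grad()
            self.manual_backward(loss)
            return loss

        opt.step(closure=closure)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```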
chaton | 7755572b4f | Check if optimizer supports closure (#4981)
* check if optimizer supports closure
* cleanup test
* resolve tests
* resolve flake
* update test due to patch limit
* update
* update dep
* Update tests/core/test_lightning_optimizer.py
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* Update tests/core/test_lightning_optimizer.py
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* resolve bug
* update test
* resolve tests
* Update requirements/extra.txt
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* remove bolts dep
* remove bolts
* add missing bolts dep for tests
* remove need for bolts
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-12-11 14:51:45 +01:00
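The check that the entry above adds guards against optimizers whose `step()` does not accept a closure. A minimal sketch of that kind of check, using only the standard library's `inspect` module; `step_accepts_closure` is a hypothetical name, not the helper the PR ships:

```python
import inspect

import torch

def step_accepts_closure(optimizer: torch.optim.Optimizer) -> bool:
    # Hypothetical helper: report whether the optimizer's step()
    # signature declares a `closure` parameter.
    return "closure" in inspect.signature(optimizer.step).parameters

params = [torch.nn.Parameter(torch.zeros(1))]
print(step_accepts_closure(torch.optim.SGD(params, lr=0.1)))  # True
```

Built-in torch optimizers such as SGD and Adam declare `step(closure=None)`, so a signature inspection like this is enough to decide whether a closure can be forwarded.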
Jirka Borovec | 4ebce38478 | update usage of deprecated automatic_optimization (#5011)
* drop deprecated usage of automatic_optimization
* Apply suggestions from code review
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Apply suggestions from code review
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-12-10 15:31:33 +05:30
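For context, the deprecation above moved `automatic_optimization` off the `Trainer` constructor and onto the `LightningModule`. A minimal sketch of the replacement usage as of the 1.1-era releases this log covers (exact details varied across nearby releases):

```python
import pytorch_lightning as pl

class ManualOptimizationModel(pl.LightningModule):
    # Declaring the mode on the module replaces the deprecated
    # Trainer(automatic_optimization=False) argument.
    @property
    def automatic_optimization(self) -> bool:
        return False
```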
chaton | 02152c1729 | Simplify optimization logic (#4984)
* Rely on ddp plugin for blocking sync behaviour, and skip if we're using manual optimization
* debug
* Revert "debug"
This reverts commit ccca6b6b
* Expose manual reduce for automatic optimization
* Add input arguments
* Enable parity test
* clean imports
* Expose hook afterwards to ensure we reset
* Fix naming
* add
* fix test
* unify optimizer logic
* resolve test
* resolve flake8
* resolve amp bug
* update tests
* remove bug
* remove optimizer_step in accelerators
* typo
* update lightning optimizer
* set doesn't work with ddp_spawn
* resolve flake8
* update threshold
* ignore pyright
* correct CodeFactor
* remove useless if
* remove zero_grad function
* simplify step
* remove typo
* resolve bug
* Apply suggestions from code review
* update on comments
* resolve bugs
* remove tests
* Update pytorch_lightning/trainer/configuration_validator.py
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* simplify testing
* add more tests
Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-12-07 12:55:49 +00:00
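The first bullet of the entry above delegates the "block gradient sync" decision to the DDP plugin and skips it under manual optimization. A rough sketch of that idea with hypothetical names, not the plugin's actual code:

```python
import contextlib

import torch

def sync_blocking_context(module: torch.nn.Module, block: bool):
    # Under automatic optimization the trainer may suppress DDP
    # gradient all-reduce between accumulation steps via no_sync();
    # under manual optimization (block=False) it leaves sync alone.
    ddp = torch.nn.parallel.DistributedDataParallel
    if block and isinstance(module, ddp):
        return module.no_sync()
    return contextlib.nullcontext()
```

A training loop would wrap the forward/backward of non-stepping accumulation batches in this context and let the final batch synchronize normally.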
chaton | c2e6e68c7e | optimizer cleanup (#4658)
* add LightningOptimizer
* typo
* add mock closure
* typo
* remove logic in optimizer_step
* update
* update
* update
* deactivate LightningOptimizer for horovod
* resolve flake
* typo
* check optimizer name
* change name
* added backward to LightningOptimizer
* remove use_lightning_optimizer
* move update
* simplify init
* resolve comments
* resolve bug
* update
* update
* resolve bugs
* resolve flake8
* set state
* make manual_optimizer_step work
* add doc
* add enable_pl_optimizer
* make optimizer_step
* add make_optimizer_step
* add examples
* resolve test
* add test_optimizer_return_options_enable_pl_optimizer
* add enable_pl_optimizer=True
* update
* update tests
* resolve bugs
* update
* set Trainer to False
* update
* resolve bugs
* update
* remove from doc
* resolve bug
* typo
* update
* set to True
* simplification
* typo
* resolve horovod
* unwrap horovod
* remove Optimizer
* resolve horovod
* move logic to amp_backend
* doesn't seem to be picklable
* update
* add again
* resolve some bugs
* cleanup
* resolve bug with AMP
* change __repr__
* round at -12
* update
* update
* update
* remove from horovod
* typo
* add convert_to_lightning_optimizers in each accelerator
* typo
* forgot
* forgot a convert_to_lightning_optimizers
* update
* update
* update
* increase coverage
* update
* resolve flake8
* update
* remove useless code
* resolve comments + add support for LightningOptimizer base class
* resolve flake
* check optimizer gets wrapped back
* resolve DDPSharded
* reduce code
* lightningoptimizer
* Update pytorch_lightning/core/optimizer.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Update pytorch_lightning/core/lightning.py
* remove reference to step function
* Apply suggestions from code review
* update on comments
* resolve
* Update CHANGELOG.md
* add back training_step in apex and native_amp
* rename optimizer_step
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-12-01 00:09:46 +00:00
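The LightningOptimizer that the entry above introduces wraps each user optimizer so the trainer can route `step()` through a closure while `zero_grad`, `state_dict`, and the rest keep delegating to the wrapped instance. A rough sketch of the wrapping idea, not the library's actual implementation:

```python
import torch

class WrappedOptimizer:
    """Illustrative stand-in for the LightningOptimizer concept."""

    def __init__(self, optimizer: torch.optim.Optimizer):
        self._optimizer = optimizer

    def step(self, closure=None, **kwargs):
        # Route step() through a closure so the trainer controls when
        # forward/backward run; fall back to a no-op closure when the
        # loss was already computed.
        if closure is None:
            closure = lambda: None
        self._optimizer.step(closure=closure, **kwargs)

    def __getattr__(self, name):
        # Delegate everything else (zero_grad, state_dict,
        # param_groups, ...) to the wrapped optimizer.
        return getattr(self._optimizer, name)

    def __repr__(self):
        return f"WrappedOptimizer({self._optimizer!r})"
```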