Carlos Mocholí
3ee3c42035
Prepare 1.1.3 release ( #5365 )
* Prepare 1.1.3 release
* Fix flake8 error
* suppress
* Remove 1.1.4 section
* Add missing commits to CHANGELOG
* Update PR template
* Add missing commit
* fix
* Update CHANGELOG.md
* Apply suggestions from code review
* Apply suggestions from code review
Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
(cherry picked from commit 4d9db866a1)
2021-01-06 15:17:27 +01:00
chaton
56437e98a6
[bug-fix] Trainer.test points to latest best_model_path ( #5161 )
* resolve bug
* update code
* add set -e
* Update pytorch_lightning/callbacks/model_checkpoint.py
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* update test
* Update tests/checkpointing/test_trainer_checkpoint.py
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
* Update tests/checkpointing/test_trainer_checkpoint.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* update on comments
* resolve test
* convert to set
* update
* add error triggering
* update
* update on comments
* update
* resolve import
* update
* update
* Update pytorch_lightning/plugins/rpc_plugin.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* update
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Ubuntu <ubuntu@ip-172-31-62-109.ec2.internal>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
(cherry picked from commit d5b367871f)
2021-01-06 15:14:10 +01:00
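For orientation on the fix above, a minimal sketch (not taken from the PR) of the behaviour it targets: trainer.test() should resolve to the latest best_model_path tracked by ModelCheckpoint. LitModel, train_dl, val_dl and test_dl are hypothetical placeholders, and ckpt_path="best" is assumed to be the ~1.1 default.

```python
# Sketch assuming PyTorch Lightning ~1.1; LitModel and the dataloaders are
# hypothetical placeholders defined elsewhere.
import pytorch_lightning as pl
from pytorch_lightning.callbacks import ModelCheckpoint

checkpoint_cb = ModelCheckpoint(monitor="val_loss", save_top_k=1)
trainer = pl.Trainer(max_epochs=3, callbacks=[checkpoint_cb])
trainer.fit(LitModel(), train_dl, val_dl)

# With #5161, this loads the weights from the latest best_model_path recorded
# by the checkpoint callback rather than a stale path.
trainer.test(ckpt_path="best", test_dataloaders=test_dl)
```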
Heewon Jeon(gogamza)
c0e9a78db4
supports --num-nodes on DDPSequentialPlugin() ( #5327 )
(cherry picked from commit d20fd8e5ab)
2021-01-06 12:48:17 +01:00
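A rough usage sketch for the plugin touched above. The import path, the balance argument splitting an nn.Sequential model across GPUs, and the need for accelerator="ddp" are assumptions drawn from the sequential-plugin commits further down, not verified against the release; PipeModel is a hypothetical placeholder.

```python
# Sketch assuming PL ~1.1 with fairscale installed; `balance` and the import
# path are assumptions based on the sequential-plugin commits below.
import pytorch_lightning as pl
from pytorch_lightning.plugins.ddp_sequential_plugin import DDPSequentialPlugin

trainer = pl.Trainer(
    gpus=2,
    num_nodes=2,  # multi-node runs with this plugin are what #5327 enables
    accelerator="ddp",
    plugins=[DDPSequentialPlugin(balance=[2, 1])],
)
trainer.fit(PipeModel())  # hypothetical LightningModule wrapping an nn.Sequential
```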
Jirka Borovec
b72ed71d4e
Refactor: clean trainer device & distrib setters ( #5297 )
* naive replace
* simplify
* clean
* .
* fix
* .
* fix
* fix
2021-01-04 17:10:13 +00:00
Jirka Borovec
0f36525e8f
fix/enable - check F401 ( #5201 )
* refactor - check F401
* missed
* fix
2020-12-21 10:15:04 +01:00
Jirka Borovec
2d54116baa
annotate unused vars ( #5017 )
* annotate all unused vars
* rank_zero_warn
* Apply suggestions from code review
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* f1 fixed
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
2020-12-19 13:53:06 +01:00
Jirka Borovec
059eaecbb4
set xxx_AVAILABLE as protected ( #5082 )
* set xxx_AVAILABLE as protected
* docs
2020-12-14 20:19:05 +05:30
chaton
ef8ef12fd0
[feat] pp 2/n ( #5026 )
* Added changes for RPC plugin
* Add missing kwargs
* Fix code format
* Loading refactors by introducing is_distributed var, fix optimizer step flow
* Add rpc guard
* Added docstrings and typing
* resolve comments
* Add additional rpc hook, refactor name of exit process hook for clarity
* remove annotation
* Modify behaviour to allow optional return, add test for rpc plugin
* resolve tests
* rename is_ddp_based
* update
* update for windows
* update
* resolve test
* code smell
* Added sequential plugin
* resolve bug
* update
* cleanup
* add Exception
* resolve docs
* Remove ddp support
* Revert distributed -> ddp
* Update pl_examples/basic_examples/conv_sequential_example.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update pl_examples/basic_examples/conv_sequential_example.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Address code review points
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Add missing return
* Fix formatting, add datamodule args
* add small comment
* resolve comments
* resolve comments
* update source for fairscale
* update extras
* remove staticmethod
* resolve flake8
* Skip tests that are failing due to bug upstream with multiple optimizers and shard
* update
* update on comments
* clean test
* latest comments
* remove old comments
* add todo
* Update version
* update
* resolve bugs
* resolve bugs
* update test
* remove hanging test
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* resolve on comments
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* resolve on comments
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Update pytorch_lightning/plugins/ddp_sequential_plugin.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* remove ImportError
Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
2020-12-09 12:56:51 +00:00
Sean Naren
ee9b3fe574
[feat] pp 1/n ( #5016 )
* Added changes for RPC plugin
* Add missing kwargs
* Fix code format
* Loading refactors by introducing is_distributed var, fix optimizer step flow
* Add rpc guard
* Added docstrings and typing
* resolve comments
* Add additional rpc hook, refactor name of exit process hook for clarity
* remove annotation
* Modify behaviour to allow optional return, add test for rpc plugin
* resolve tests
* rename is_ddp_based
* update
* update for windows
* update
* resolve test
* code smell
* Revert back to init_ddp_connection for backwards compat
* Swap to explicit name for property
* Add missing speed parity increase for CI variability, fix call counts for child process
Co-authored-by: tchaton <thomas@grid.ai>
2020-12-08 22:02:10 +00:00
chaton
2393474350
[hotfix] ddp + manual_optimisation ( #4976 )
* Rely on ddp plugin for blocking sync behaviour, and skip if we're using manual optimization
* debug
* Revert "debug"
This reverts commit ccca6b6b
* Expose manual reduce for automatic optimization
* Add input arguments
* Enable parity test
* clean imports
* Expose hook after to ensure we reset
* Fix naming
* add
* fix test
* resolve on comments
* typo
* Update tests/trainer/optimization/test_manual_optimization.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update tests/trainer/optimization/test_manual_optimization.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* update on comments
* resolve comments
Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-12-07 19:31:54 +00:00
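For context, a compact sketch of the manual-optimisation path this hotfix exercises under DDP. The automatic_optimization property override, the self.optimizers() accessor and the manual_backward signature are assumed from the ~1.1 API; verify against the matching release docs.

```python
# Sketch of ~1.1-era manual optimization (assumed API); data shapes are
# illustrative only.
import torch
import pytorch_lightning as pl


class ManualOptModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    @property
    def automatic_optimization(self) -> bool:
        return False  # opt out of Lightning's automatic loop

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.manual_backward(loss, opt)  # lets the plugin handle DDP/AMP sync
        opt.step()
        opt.zero_grad()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```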
chaton
02152c1729
Simplify optimization Logic ( #4984 )
* Rely on ddp plugin for blocking sync behaviour, and skip if we're using manual optimization
* debug
* Revert "debug"
This reverts commit ccca6b6b
* Expose manual reduce for automatic optimization
* Add input arguments
* Enable parity test
* clean imports
* Expose hook after to ensure we reset
* Fix naming
* add
* fix test
* uniformize optimizer logic
* resolve test
* resolve flake8
* resolve amp bug
* update tests
* remove bug
* remove optimizer_step in accelerators
* typo
* update lightning optimizer
* set doesn't work with ddp_spawn
* resolve flake8
* update threshold
* ignore pyright
* correct codeFactor
* remove useless if
* remove zer_grad function
* simplify step
* remove typo
* resolve bug
* Apply suggestions from code review
* update on comments
* resolve bugs
* remove tests
* Update pytorch_lightning/trainer/configuration_validator.py
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* simplify testing
* add more tests
Co-authored-by: SeanNaren <sean@grid.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-12-07 12:55:49 +00:00
Sean Naren
a51477ff25
Fix exception message for invalid custom string plugin ( #4979 )
* Fix exception error from generator to list of valid names
* Update pytorch_lightning/plugins/plugin_connector.py
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-12-05 23:36:05 +00:00
Sean Naren
e952dee292
Allow string plugins ( #4888 )
* Allow plugin to be chosen via string
* Fix implementation, add tests
* Fix codefactor issues
* Added missing env patch
* Skip test for windows
* Reword reason
* Add skip to invalid test
* Create required_plugins function, move sharded amp requirement to plugin
* Pass AMPType, fix setter for apex
* Better doc strings
* Add exception when using apex
* Add trainer available_plugins function, warn user when plugins have been added automatically with option to override behaviour
* Fixed pep8 indent
* Fix codefactor issues
* Add env variables
* Update pytorch_lightning/cluster_environments/cluster_environment.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Addressed code review
* Update pytorch_lightning/plugins/plugin_connector.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update pytorch_lightning/plugins/plugin_connector.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Update pytorch_lightning/plugins/plugin_connector.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Addressed more code review feedback
* Fixed docstrings
* Swapped to verbose runtime error
* Apply suggestions from code review
* Apply suggestions from code review
* Update pytorch_lightning/plugins/sharded_plugin.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* Change name
* Pass trainer to plugins that may require it
* Fix sharded plugin
* Added test to ensure string sharded works
* Removed trainer typing as this breaks pep8
* Fixed doc issues
* Fixed tests
Co-authored-by: chaton <thomas@grid.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-12-01 20:30:49 +00:00
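A short sketch of the string-based plugin selection added here, assuming "ddp_sharded" is among the recognised names (the sharded-plugin test added in this PR suggests so) and that it is combined with a DDP accelerator:

```python
# Sketch assuming PL ~1.1 with fairscale installed; the plugin name string is
# an assumption based on the sharded-plugin test added in this PR.
import pytorch_lightning as pl

trainer = pl.Trainer(
    gpus=2,
    accelerator="ddp",
    plugins="ddp_sharded",  # plugin chosen via string instead of an instance
)
# An unrecognised string should raise an error listing the valid plugin names
# (see the exception-message fix in #4979 above).
```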
chaton
c2e6e68c7e
optimizer clean up ( #4658 )
* add LightningOptimizer
* typo
* add mock closure
* typo
* remove logic in optimizer_step
* update
* update
* update
* deactivate LightningOptimizer for horovod
* resolve flake
* typo
* check optimizer name
* change name
* added backward to LightningOptimizer
* remove use_lightning_optimizer
* move update
* simplify init
* resolve comments
* resolve bug
* update
* update
* resolve bugs
* resolve flake8
* set state
* work manual_optimizer_step
* add doc
* add enable_pl_optimizer
* make optimizer_step
* add make_optimizer_step
* add examples
* resolve test
* add test_optimizer_return_options_enable_pl_optimizer
* add enable_pl_optimizer=True
* update
* update tests
* resolve bugs
* update
* set Trainer to False
* update
* resolve bugs
* update
* remove from doc
* resolve bug
* typo
* update
* set to True
* simplification
* typo
* resolve horovod
* unwrap horovod
* remove Optimizer
* resolve horovod
* move logic to amp_backend
* doesn't seem to be picklable
* update
* add again
* resolve some bugs
* cleanup
* resolve bug with AMP
* change __repr__
* round at -12
* update
* update
* update
* remove from horovod
* typo
* add convert_to_lightning_optimizers in each accelerators
* typo
* forgot
* forgot a convert_to_lightning_optimizers
* update
* update
* update
* increase coverage
* update
* resolve flake8
* update
* remove useless code
* resolve comments + add support for LightningOptimizer base class
* resolve flake
* check optimizer get wrapped back
* resolve DDPSharded
* reduce code
* lightningoptimizer
* Update pytorch_lightning/core/optimizer.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Update pytorch_lightning/core/lightning.py
* remove reference to step function
* Apply suggestions from code review
* update on comments
* resolve
* Update CHANGELOG.md
* add back training_step in apex and native_amp
* rename optimizer_step
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-12-01 00:09:46 +00:00
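For orientation, a sketch of the LightningOptimizer wrapping introduced here, assuming the enable_pl_optimizer Trainer flag added in this PR and that optimizers reaching user hooks arrive wrapped; LitModel and train_dl are hypothetical placeholders.

```python
# Sketch assuming PL ~1.1: with enable_pl_optimizer=True, optimizers handed to
# hooks are wrapped in LightningOptimizer so optimizer.step() also drives the
# closure and precision handling.
import pytorch_lightning as pl
from pytorch_lightning.core.optimizer import LightningOptimizer

trainer = pl.Trainer(enable_pl_optimizer=True)
trainer.fit(LitModel(), train_dl)  # hypothetical placeholders

# Inside hooks such as optimizer_step, the optimizer is expected to satisfy:
# isinstance(optimizer, LightningOptimizer)
```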
SeanNaren
1704773712
Address code review
2020-11-27 14:50:12 +00:00
SeanNaren
bde2a12990
Fix var name
2020-11-27 10:37:49 +00:00
SeanNaren
737447fc6e
Merge branch 'master' into feature/plug
# Conflicts:
# pytorch_lightning/trainer/connectors/precision_connector.py
# pytorch_lightning/utilities/__init__.py
2020-11-26 23:02:36 +00:00
Jirka Borovec
11e73ceaa6
fix import and typo in AMP ( #4871 )
* fix import and typo
* docs
* apex
* fix
* typo
2020-11-26 23:45:52 +01:00
SeanNaren
ab655e5118
Removed old eval logic, added eval tests
2020-11-26 18:49:06 +00:00
SeanNaren
8a0c8fe0bd
Fixed imports, swap to relying on function for entire batch
2020-11-26 16:48:21 +00:00
SeanNaren
47c121ef1a
Addressed code review points
2020-11-26 16:44:45 +00:00
SeanNaren
091c236392
Ensure we do windows check first
2020-11-26 11:07:11 +00:00
SeanNaren
8f9763166d
Add check to fairscale override
2020-11-26 10:59:23 +00:00
SeanNaren
80e5329c1f
Add check for windows to plugin
2020-11-26 10:24:25 +00:00
SeanNaren
79527672cb
Remove amp check as guard now upstream
2020-11-26 10:13:27 +00:00
SeanNaren
6c8715e739
Swap ordering of imports
2020-11-25 21:31:36 +00:00
SeanNaren
321e63ae8b
Fixes to import
2020-11-25 21:17:21 +00:00
SeanNaren
6b93987b31
Revert "Add check to ensure 1.6"
This reverts commit ba312473
2020-11-25 21:01:42 +00:00
SeanNaren
cf7a7f7b8d
Add additional else check
2020-11-25 20:20:45 +00:00
SeanNaren
888b12bbc9
Add additional else check
2020-11-25 20:20:45 +00:00
SeanNaren
ba312473f8
Add check to ensure 1.6
2020-11-25 19:40:58 +00:00
SeanNaren
bfe754da12
Removed comments, skip test
2020-11-25 12:55:02 +00:00
SeanNaren
17f23e5e66
Ensure imports are not required explicitly for type casting
2020-11-24 20:11:12 +00:00
SeanNaren
6b129216d0
Add catches around fairscale installation
2020-11-24 19:23:55 +00:00
SeanNaren
f765364c02
Fixed configure_ddp, removed lr scheduler modification, added unit tests
2020-11-24 18:05:00 +00:00
SeanNaren
08d37d9cd2
Fixed name ref
2020-11-23 20:20:19 +00:00
SeanNaren
d953f2be5b
Merge branch 'master' into feature/fairscale-817-6n
# Conflicts:
# pytorch_lightning/accelerators/accelerator.py
# pytorch_lightning/accelerators/ddp2_accelerator.py
# pytorch_lightning/accelerators/ddp_accelerator.py
# pytorch_lightning/accelerators/ddp_cpu_spawn_accelerator.py
# pytorch_lightning/accelerators/ddp_hpc_accelerator.py
# pytorch_lightning/accelerators/ddp_spawn_accelerator.py
# pytorch_lightning/accelerators/dp_accelerator.py
# pytorch_lightning/plugins/ddp_plugin.py
# pytorch_lightning/trainer/connectors/model_connector.py
2020-11-23 20:19:46 +00:00
Sean Naren
404af43cde
5/n: Extract reference model call to plugins/accelerators ( #4773 )
* Encapsulate extracting reference model within the plugin to allow custom wrapper logic to live within the plugin/accelerators
* Add missing new lines
* Fix call to accelerator
* Removed double blank
* Use accelerator backend
* Handle case where wrapper has not been initialized within the plugin
* Added basic get model tests, add better typing
* Change model name
* Split GPU/DDP test
* Add stronger typing, skip ddp test on windows
* Fix import
* Fix import in dp
* Fixed PEP8 definition
* Add ddp launcher for ddp testing
* Modify accelerator reference model to property, change name to reflect func
* Revert property as this is incorrect.
* Revert across accelerators
* Modified name to get_model_from_plugin
* Code review changes, fix issue with dp
* Add verb to function getter
Co-authored-by: chaton <thomas@grid.ai>
2020-11-23 17:21:47 +00:00
SeanNaren
df416f6c78
Fix conversion in on_before_forward
2020-11-22 15:06:11 +00:00
SeanNaren
50ed083fc7
Add module wrapper code
2020-11-22 15:00:44 +00:00
SeanNaren
9c34589493
Assert availability via imports
2020-11-22 15:00:44 +00:00
SeanNaren
2e8585f46a
Add base code
2020-11-22 15:00:44 +00:00
SeanNaren
358f503848
Modify accelerator reference model to property, change name to reflect func
2020-11-22 11:39:00 +00:00
SeanNaren
15734e9dc9
Fixed PEP8 definition
2020-11-19 14:18:52 +00:00
SeanNaren
aebb1a30ff
Add stronger typing, skip ddp test on windows
2020-11-19 13:42:27 +00:00
SeanNaren
0864b1c893
Added basic get model tests, add better typing
2020-11-19 12:36:23 +00:00
SeanNaren
84ccdbf886
Handle case where wrapper has not been initialized within the plugin
2020-11-19 11:59:30 +00:00
SeanNaren
be4c24c484
Encapsulate extracting reference model within the plugin to allow custom wrapper logic to live within the plugin/accelerators
2020-11-19 10:43:16 +00:00
Sean Naren
f0ab74dc2f
Expose scaler in amp plugin ( #4737 )
2020-11-18 22:30:47 +00:00
ananthsub
45c57600af
Move init_ddp_connection to DDP Plugin ( #4407 )
* Move init_ddp_connection to DDP Plugin
* cluster-env
* trainer?
* imports
* Update ddp_plugin.py
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-11-18 15:49:22 -05:00
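With init_ddp_connection on the plugin, it can be customised by subclassing DDPPlugin; a minimal sketch (the exact hook signature changed between releases, so the override below simply forwards its arguments; LoggingDDPPlugin is a hypothetical name):

```python
# Sketch only: a custom DDPPlugin overriding init_ddp_connection now that it
# lives on the plugin (#4407). *args/**kwargs forwarding avoids pinning the
# exact signature, which varied between releases.
import pytorch_lightning as pl
from pytorch_lightning.plugins.ddp_plugin import DDPPlugin


class LoggingDDPPlugin(DDPPlugin):  # hypothetical subclass for illustration
    def init_ddp_connection(self, *args, **kwargs):
        print("initialising DDP process group", args, kwargs)
        return super().init_ddp_connection(*args, **kwargs)


trainer = pl.Trainer(gpus=2, accelerator="ddp", plugins=[LoggingDDPPlugin()])
```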