lightning/pytorch_lightning/plugins
Sean Naren bacabaebaf
Sharded Accelerator 1/n: Expose clip gradients to plugins via abstract class (#4639)
* Added an abstract precision plugin to expose a clip_gradients function, used within the accelerator to clip gradients

* Excluded the model from the override, keeping the optimizer (needed for sharded clip gradients); added an override for Apex O2 support

* Fixed docs

* Applied code review changes

* Refactored the clip function to encapsulate TPU-specific changes within the TPU accelerator; default to the standard clip function for vanilla torch

* Pass the correct grad clip value

* Moved variable to a property

* Applied code review suggestions
2020-11-12 17:18:09 +00:00
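The commit above describes moving gradient clipping behind an abstract precision plugin so that sharded and mixed-precision setups can override it. A minimal sketch of that idea is shown below; the class and method names (`PrecisionPlugin`, `clip_gradients`) follow the commit description, but the body is an illustrative assumption, not the actual pytorch_lightning implementation at this revision:

```python
from abc import ABC
from typing import Union

import torch
from torch.optim import Optimizer


class PrecisionPlugin(ABC):
    """Sketch of an abstract precision plugin exposing gradient clipping.

    Subclasses (e.g. an Apex O2 plugin) can override clip_gradients; the
    default mirrors standard clipping for vanilla torch. Hypothetical
    signature, for illustration only.
    """

    def clip_gradients(self, optimizer: Optimizer, grad_clip_val: Union[int, float]) -> None:
        # Note the optimizer (not the model) is passed in, as the commit
        # says it is needed for sharded gradient clipping: clip the global
        # norm of all parameters owned by the optimizer.
        parameters = [p for group in optimizer.param_groups for p in group["params"]]
        torch.nn.utils.clip_grad_norm_(parameters, max_norm=float(grad_clip_val))
```

Keeping the default in the base class means the vanilla-torch path needs no plugin-specific code, while TPU- or Apex-specific clipping can live entirely in the corresponding accelerator or plugin override.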
__init__.py          ref: apex plugin (#3502)                                                              2020-09-15 06:02:42 -04:00
apex.py              Sharded Accelerator 1/n: Expose clip gradients to plugins via abstract class (#4639)  2020-11-12 17:18:09 +00:00
ddp_plugin.py        Enable DDP Plugin to pass through args to LightningDistributedDataParallel (#4382)    2020-10-27 12:27:59 +00:00
native_amp.py        Sharded Accelerator 1/n: Expose clip gradients to plugins via abstract class (#4639)  2020-11-12 17:18:09 +00:00
plugin_connector.py  Enable custom apex and amp plugins (#4355)                                            2020-10-25 17:11:07 -04:00
precision_plugin.py  Sharded Accelerator 1/n: Expose clip gradients to plugins via abstract class (#4639)  2020-11-12 17:18:09 +00:00