lightning/pytorch_lightning/plugins/precision
Latest commit: 7e2f84e050 by deepsource-autofix[bot] on 2021-07-02 08:00:55 +00:00
Remove methods with unnecessary super delegation. (#8148)
* Update fully_sharded.py
* replace init in test
Co-authored-by: deepsource-autofix[bot] <62050782+deepsource-autofix[bot]@users.noreply.github.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Ethan Harris <ethanwharris@gmail.com>
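For context, "unnecessary super delegation" refers to an override whose body only forwards to the parent implementation. A generic Python sketch of the pattern (illustrative only, not the actual Lightning code):

```python
class BasePlugin:
    def connect(self, model):
        return model


class RedundantPlugin(BasePlugin):
    # Unnecessary super delegation: this override adds no behavior.
    # Deleting the method changes nothing, because attribute lookup
    # falls through to BasePlugin.connect automatically.
    def connect(self, model):
        return super().connect(model)
```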
File                         Date                        Last commit
__init__.py                  2021-05-24 08:11:45 +01:00  FSDP with full state dict (#7487)
apex_amp.py                  2021-06-25 19:16:11 +00:00  Support state restoration of logged results 2/2 (#7966)
deepspeed_precision.py       2021-04-27 11:34:02 +00:00  Add back `clip_gradients(model)` (#7231)
double.py                    2021-07-02 08:00:55 +00:00  Remove methods with unnecessary super delegation. (#8148)
fully_sharded_native_amp.py  2021-05-24 08:11:45 +01:00  FSDP with full state dict (#7487)
ipu_precision.py             2021-06-11 15:07:04 +00:00  IPU Integration 5/5 (#7867)
mixed.py                     2021-05-07 16:41:39 +00:00  Use `torch.nn.utils.clip_grad_norm_` and add `clip_grad_by_value` support for TPU (#7025)
native_amp.py                2021-06-16 00:23:30 +00:00  Make optimizers skippable when using amp (#7975)
precision_plugin.py          2021-06-07 07:45:01 +00:00  move amp checkpoint state management to precision plugin (#7831)
sharded_native_amp.py        2021-04-15 16:48:16 +00:00  Typing for accelerators and plugins (#7022)
tpu_bfloat.py                2021-04-19 14:43:16 +02:00  Create pytorch_lightning/utilities/types.py (#7048)
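In practice these plugins are usually selected through the Trainer's precision-related flags rather than instantiated directly. A minimal sketch, assuming the Trainer arguments of this release line (`precision`, `amp_backend`, `amp_level`, `gradient_clip_algorithm`):

```python
import pytorch_lightning as pl

# precision=16 with the default native backend uses native_amp.py
# (torch.cuda.amp autocast plus GradScaler).
trainer = pl.Trainer(precision=16, gpus=1)

# amp_backend="apex" switches to the NVIDIA Apex plugin in apex_amp.py.
trainer = pl.Trainer(precision=16, amp_backend="apex", amp_level="O2", gpus=1)

# precision=64 runs the model in double precision via double.py.
trainer = pl.Trainer(precision=64)

# gradient_clip_algorithm="value" selects elementwise clipping
# (extended to TPU in #7025); the default algorithm clips by norm.
trainer = pl.Trainer(gradient_clip_val=0.5, gradient_clip_algorithm="value")
```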