lightning/pytorch_lightning/plugins/precision
Latest commit `1b06edf2f2` by Dusan Drevicky (co-authored by Carlos Mocholi <carlossmocholi@gmail.com>): Add the `on_before_optimizer_step` hook (#8048), 2021-07-09 13:30:52 +02:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | FSDP with full state dict (#7487) | 2021-05-24 08:11:45 +01:00 |
| `apex_amp.py` | Add the `on_before_optimizer_step` hook (#8048) | 2021-07-09 13:30:52 +02:00 |
| `deepspeed_precision.py` | Add the `on_before_optimizer_step` hook (#8048) | 2021-07-09 13:30:52 +02:00 |
| `double.py` | Remove methods with unnecessary super delegation. (#8148) | 2021-07-02 08:00:55 +00:00 |
| `fully_sharded_native_amp.py` | FSDP with full state dict (#7487) | 2021-05-24 08:11:45 +01:00 |
| `ipu_precision.py` | Refactor plugins backward (#8328) | 2021-07-08 16:02:09 +02:00 |
| `mixed.py` | Use `torch.nn.utils.clip_grad_norm_` and add `clip_grad_by_value` support for TPU (#7025) | 2021-05-07 16:41:39 +00:00 |
| `native_amp.py` | Add the `on_before_optimizer_step` hook (#8048) | 2021-07-09 13:30:52 +02:00 |
| `precision_plugin.py` | Add the `on_before_optimizer_step` hook (#8048) | 2021-07-09 13:30:52 +02:00 |
| `sharded_native_amp.py` | Typing for accelerators and plugins (#7022) | 2021-04-15 16:48:16 +00:00 |
| `tpu_bfloat.py` | Create pytorch_lightning/utilities/types.py (#7048) | 2021-04-19 14:43:16 +02:00 |
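For context, users do not normally instantiate these plugins directly; in PyTorch Lightning of this era they are selected through the `precision` and `amp_backend` arguments of `Trainer`. A minimal sketch of that selection, assuming a hypothetical `LightningModule` subclass named `MyModel`:

```python
import pytorch_lightning as pl

# Hypothetical model for illustration; any LightningModule works here.
from my_project import MyModel

model = MyModel()

# `precision=16` with the default `amp_backend="native"` routes training
# through the native-AMP precision plugin (native_amp.py).
trainer = pl.Trainer(precision=16, amp_backend="native", gpus=1)

# Choosing `amp_backend="apex"` instead selects the Apex-based plugin
# (apex_amp.py), provided NVIDIA Apex is installed:
# trainer = pl.Trainer(precision=16, amp_backend="apex", amp_level="O2", gpus=1)

trainer.fit(model)
```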