lightning/pytorch_lightning/plugins
Latest commit fcf6e3ecba: Use hpu hmp for bf16 alone (#13028)
Jerome Anand, 2022-05-12 09:36:28 -04:00

* Use hpu hmp for bf16 alone
* Update changelog

Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Name            Last commit                                                                                               Date
environments    Refactor `TorchElasticEnvironment.detect` to use `torch.distributed.is_torchelastic_launched` (#12376)   2022-03-21 16:51:24 +01:00
io              Add support for Habana accelerator (HPU) (#11808)                                                         2022-03-25 10:24:52 +00:00
precision       Use hpu hmp for bf16 alone (#13028)                                                                       2022-05-12 09:36:28 -04:00
training_type   Add deprecation path for renamed training type plugins (#11227)                                           2022-01-03 13:41:05 +01:00
__init__.py     Add support for Habana accelerator (HPU) (#11808)                                                         2022-03-25 10:24:52 +00:00
layer_sync.py   Add `SyncBatchNormPlugin` (#11754)                                                                        2022-03-01 19:41:40 +05:30