lightning/pytorch_lightning/plugins
Latest commit caed77f155 by Danielle Pintz
Refactor `TorchElasticEnvironment.detect` to use `torch.distributed.is_torchelastic_launched` (#12376)
* Refactor TorchElasticEnvironment.detect to use native utility from torch.distributed
* fix version and tests
* fix version
* Update tests/accelerators/test_accelerator_connector.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2022-03-21 16:51:24 +01:00
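
The commit above swaps Lightning's own detection logic for the native utility that PyTorch already ships. A minimal sketch of the idea, assuming a PyTorch build where `torch.distributed` is available; the standalone `detect_torchelastic` helper is purely illustrative, while `TorchElasticEnvironment.detect()` is the plugin's actual entry point:

```python
import torch.distributed as dist

from pytorch_lightning.plugins.environments import TorchElasticEnvironment


def detect_torchelastic() -> bool:
    """Illustrative only: mirrors what the refactored ``detect()`` delegates to.

    ``torch.distributed.is_torchelastic_launched()`` reports whether the current
    process was started by the torchelastic/torchrun launcher.
    """
    return dist.is_torchelastic_launched()


if __name__ == "__main__":
    # The plugin's static method is the canonical check used by Lightning.
    print("torchelastic detected:", TorchElasticEnvironment.detect())
    print("helper sketch agrees:", detect_torchelastic())
```

Delegating to the launcher-provided check keeps the plugin in sync with however PyTorch decides to mark torchelastic runs, instead of hard-coding a list of environment variables in Lightning.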
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `environments/` | Refactor `TorchElasticEnvironment.detect` to use `torch.distributed.is_torchelastic_launched` (#12376) | 2022-03-21 16:51:24 +01:00 |
| `io/` | Support passing `storage_options` in `trainer.save_checkpoint()` API (#11891) | 2022-03-09 18:35:50 +00:00 |
| `precision/` | Move ipu precision flag check to IPUPrecisionPlugin init (#12148) | 2022-03-05 09:03:24 +00:00 |
| `training_type/` | Add deprecation path for renamed training type plugins (#11227) | 2022-01-03 13:41:05 +01:00 |
| `__init__.py` | Add `SyncBatchNormPlugin` (#11754) | 2022-03-01 19:41:40 +05:30 |
| `layer_sync.py` | Add `SyncBatchNormPlugin` (#11754) | 2022-03-01 19:41:40 +05:30 |
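
The `io/` entry above refers to the `storage_options` argument that `trainer.save_checkpoint()` forwards to the active checkpoint I/O plugin. A hedged sketch of how a custom plugin might consume it, assuming the PL 1.6-era `CheckpointIO` interface; the `LoggingCheckpointIO` class and the `{"acl": "private"}` options are made up for illustration:

```python
from typing import Any, Dict, Optional

from pytorch_lightning.plugins.io import TorchCheckpointIO


class LoggingCheckpointIO(TorchCheckpointIO):
    """Hypothetical CheckpointIO subclass that inspects ``storage_options``."""

    def save_checkpoint(
        self, checkpoint: Dict[str, Any], path: str, storage_options: Optional[Any] = None
    ) -> None:
        # Whatever the caller passes to trainer.save_checkpoint(..., storage_options=...)
        # arrives here untouched; a real plugin might map it to S3 ACLs, encryption
        # settings, and so on.
        print(f"saving {path} with storage_options={storage_options}")
        super().save_checkpoint(checkpoint, path)


# Usage sketch (assumes a fitted LightningModule ``model``):
#   from pytorch_lightning import Trainer
#   trainer = Trainer(plugins=[LoggingCheckpointIO()])
#   trainer.fit(model)
#   trainer.save_checkpoint("model.ckpt", storage_options={"acl": "private"})
```

Routing backend-specific options through an opaque `storage_options` object keeps the `Trainer` API stable while letting each `CheckpointIO` implementation interpret the options it understands.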