Fix torchelastic detection with non-distributed installations (#13142)

* Fix torchelastic detection under Mac

* CHANGELOG
Carlos Mocholí 2022-05-25 12:23:41 +02:00 committed by lexierule
parent 4621a64f62
commit 41b0ffe1f4
2 changed files with 3 additions and 1 deletion

CHANGELOG.md

@@ -18,6 +18,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed the number of references of `LightningModule` so it can be deleted ([#12897](https://github.com/PyTorchLightning/pytorch-lightning/pull/12897))
 - Fixed `materialize_module` setting a module's child recursively ([#12870](https://github.com/PyTorchLightning/pytorch-lightning/pull/12870))
 - Fixed issue where the CLI could not pass a `Profiler` to the `Trainer` ([#13084](https://github.com/PyTorchLightning/pytorch-lightning/pull/13084))
+- Fixed torchelastic detection with non-distributed installations ([#13142](https://github.com/PyTorchLightning/pytorch-lightning/pull/13142))
 
 ## [1.6.3] - 2022-05-03

pytorch_lightning/plugins/environments/torchelastic_environment.py

@@ -62,7 +62,8 @@ class TorchElasticEnvironment(ClusterEnvironment):
     def detect() -> bool:
         """Returns ``True`` if the current process was launched using the torchelastic command."""
         if _TORCH_GREATER_EQUAL_1_9_1:
-            return torch.distributed.is_torchelastic_launched()
+            # if not available (for example on MacOS), `is_torchelastic_launched` is not defined
+            return torch.distributed.is_available() and torch.distributed.is_torchelastic_launched()
         required_env_vars = {"RANK", "GROUP_RANK", "LOCAL_RANK", "LOCAL_WORLD_SIZE"}
         return required_env_vars.issubset(os.environ.keys())
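
For context, a minimal standalone sketch of the fixed detection logic. The version check via `packaging` is an assumption standing in for Lightning's internal `_TORCH_GREATER_EQUAL_1_9_1` flag; everything else mirrors the hunk above.

import os

import torch
from packaging.version import Version

# Assumption: approximates Lightning's internal _TORCH_GREATER_EQUAL_1_9_1 constant.
_TORCH_GREATER_EQUAL_1_9_1 = Version(torch.__version__) >= Version("1.9.1")


def detect() -> bool:
    """Returns ``True`` if the current process was launched using the torchelastic command."""
    if _TORCH_GREATER_EQUAL_1_9_1:
        # `is_available()` is False on builds without distributed support
        # (e.g. macOS CPU wheels), where `is_torchelastic_launched` is not defined.
        return torch.distributed.is_available() and torch.distributed.is_torchelastic_launched()
    # Older torch: torchelastic exports these variables into every worker process.
    required_env_vars = {"RANK", "GROUP_RANK", "LOCAL_RANK", "LOCAL_WORLD_SIZE"}
    return required_env_vars.issubset(os.environ.keys())

The point of the fix is the short-circuiting `and`: on a non-distributed build, `is_available()` returns `False`, so the missing `is_torchelastic_launched` attribute is never looked up and no `AttributeError` is raised.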