Remove AcceleratorConnector.devices (#12435)

Author: DuYicong515, 2022-03-24 17:35:46 -07:00 (committed by GitHub)
parent 6329be60be
commit b5b951b05a
GPG Key ID: 4AEE18F83AFDEB23 (no known key found for this signature in database)
2 changed files with 3 additions and 9 deletions


@@ -792,6 +792,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Removed `AcceleratorConnector.parallel_device_ids` property ([#12072](https://github.com/PyTorchLightning/pytorch-lightning/pull/12072))
+- Removed `AcceleratorConnector.devices` property ([#12435](https://github.com/PyTorchLightning/pytorch-lightning/pull/12435))
 ### Fixed
 - Fixed an issue where `ModelCheckpoint` could delete older checkpoints when `dirpath` has changed during resumed training ([#12045](https://github.com/PyTorchLightning/pytorch-lightning/pull/12045))


@@ -59,7 +59,6 @@ from pytorch_lightning.strategies import (
     DeepSpeedStrategy,
     HorovodStrategy,
     IPUStrategy,
-    ParallelStrategy,
     SingleDeviceStrategy,
     SingleTPUStrategy,
     Strategy,
@@ -780,14 +779,6 @@ class AcceleratorConnector:
     def parallel_devices(self) -> List[Union[torch.device, int]]:
         return self._parallel_devices
 
-    @property
-    def devices(self) -> int:
-        if isinstance(self.strategy, SingleDeviceStrategy):
-            return 1
-        elif isinstance(self.strategy, ParallelStrategy):
-            return len(self.strategy.parallel_devices)
-        return 0
-
     @property
     def tpu_cores(self) -> Optional[Union[List[int], int]]:
         if isinstance(self.accelerator, TPUAccelerator):
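The removed `devices` property counted devices by dispatching on the strategy type: 1 for a single-device strategy, the length of `parallel_devices` for a parallel strategy, 0 otherwise. A minimal standalone sketch of that logic, using illustrative stub classes rather than the real `pytorch_lightning.strategies` types:

```python
from typing import List


class Strategy:
    """Stub base class standing in for pytorch_lightning's Strategy."""


class SingleDeviceStrategy(Strategy):
    """Stub: a strategy that always runs on exactly one device."""


class ParallelStrategy(Strategy):
    """Stub: a strategy that runs across several devices."""

    def __init__(self, parallel_devices: List[int]) -> None:
        self.parallel_devices = parallel_devices


def count_devices(strategy: Strategy) -> int:
    # Mirrors the removed property: 1 for a single device,
    # the number of parallel devices otherwise, 0 as a fallback.
    if isinstance(strategy, SingleDeviceStrategy):
        return 1
    if isinstance(strategy, ParallelStrategy):
        return len(strategy.parallel_devices)
    return 0
```

After this removal, callers are expected to query the device count elsewhere (the PR title points at the connector no longer owning this bookkeeping); the stubs above only illustrate the deleted branch logic.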