Remove unused `on_train_epoch_end` hook in accelerator (#9035)

ananthsub 2021-08-22 11:50:10 -07:00 committed by GitHub
parent 930b81f96c
commit 8a931732ae
2 changed files with 3 additions and 4 deletions


@@ -166,6 +166,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Removed deprecated `connect_precision_plugin` and `connect_training_type_plugin` from `Accelerator` ([#9019](https://github.com/PyTorchLightning/pytorch-lightning/pull/9019))
+- Removed `on_train_epoch_end` from `Accelerator` ([#9035](https://github.com/PyTorchLightning/pytorch-lightning/pull/9035))
 
 ### Fixed
 
 - Ensure the existence of `DDPPlugin._sync_dir` in `reconciliate_processes` ([#8939](https://github.com/PyTorchLightning/pytorch-lightning/pull/8939))


@@ -479,10 +479,6 @@ class Accelerator:
     def update_global_step(self, total_batch_idx: int, current_global_step: int) -> int:
         return self.training_type_plugin.update_global_step(total_batch_idx, current_global_step)
 
-    def on_train_epoch_end(self) -> None:
-        """Hook to do something on the end of an training epoch."""
-        pass
-
     def on_train_start(self) -> None:
         """Called when train begins."""
         return self.training_type_plugin.on_train_start()
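The diff above shows why the hook was safe to delete: `on_train_epoch_end` was a bare `pass`, while the hooks that stay (e.g. `on_train_start`) forward to the training-type plugin. A minimal sketch of that delegation pattern, using simplified stand-in classes rather than the real PyTorch Lightning API:

```python
class TrainingTypePlugin:
    """Stand-in for a training-type plugin that owns the real hook logic."""

    def on_train_start(self) -> str:
        return "plugin: train started"


class Accelerator:
    """Stand-in accelerator: keeps only hooks that actually delegate."""

    def __init__(self, training_type_plugin: TrainingTypePlugin) -> None:
        self.training_type_plugin = training_type_plugin

    def on_train_start(self) -> str:
        # Useful hook: forwards to the plugin, as in the diff above.
        # A hook whose body is just `pass` (like the removed
        # `on_train_epoch_end`) contributes nothing and can be dropped.
        return self.training_type_plugin.on_train_start()


accelerator = Accelerator(TrainingTypePlugin())
print(accelerator.on_train_start())  # -> plugin: train started
```

Removing the dead hook means callers go through the plugin directly (or via a delegating hook), leaving a single place where per-strategy behavior lives.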