Update learning rate on each backward pass instead of each forward pass. (#1477)

* change lr scheduler step interval to update every backwards pass instead of every forwards pass

* update CHANGELOG

* fix spacing

* Add TODO to lr schedule update

* remove trailing whitespace

Co-authored-by: William Falcon <waf2107@columbia.edu>
Roshan Rao 2020-04-20 05:03:52 -07:00 committed by GitHub
parent 4fca994d0e
commit 0203938af8
2 changed files with 10 additions and 7 deletions
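For context, a minimal sketch of the setup this change affects: a LightningModule that returns a scheduler with `interval='step'` from `configure_optimizers`, trained with gradient accumulation. The module and the concrete values below are illustrative assumptions, not code from this commit; after this change the step-interval scheduler advances once per optimizer step (every `accumulate_grad_batches` batches) rather than once per batch.

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LinearModel(pl.LightningModule):
    """Illustrative module with a step-interval LR scheduler (not part of this commit)."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        return {'loss': self(batch).sum()}

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
        # 'interval': 'step' schedulers are the ones whose update timing changes here
        return [optimizer], [{'scheduler': scheduler, 'interval': 'step'}]


# With accumulate_grad_batches=4 the optimizer steps every 4 batches; after this
# commit the step-interval scheduler advances on those batches, not on every batch.
trainer = pl.Trainer(accumulate_grad_batches=4, max_epochs=1)
```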

CHANGELOG.md

@@ -24,20 +24,21 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Changed the default behaviour to no longer include a NaN check with each training iteration. ([#1475](https://github.com/PyTorchLightning/pytorch-lightning/pull/1475))
- Changed lr schedule step interval behavior to update every backwards pass instead of every forwards pass ([#1476](https://github.com/PyTorchLightning/pytorch-lightning/issues/1476))
- Updated semantic segmentation example with custom u-net and logging ([#1371](https://github.com/PyTorchLightning/pytorch-lightning/pull/1371))
-
### Deprecated
-
-
### Removed
-
-
-
-
### Fixed
@@ -52,7 +53,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed a bug that caused the `callbacks` Trainer argument to reference a global variable ([#1534](https://github.com/PyTorchLightning/pytorch-lightning/pull/1534)).
## [0.7.3] - 2020-04-09
### Added

pytorch_lightning/trainer/training_loop.py

@@ -454,8 +454,11 @@ class TrainerTrainLoopMixin(ABC):
             # when returning -1 from train_step, we end epoch early
             early_stop_epoch = batch_result == -1

-            # update lr
-            self.update_learning_rates(interval='step')
+            # TODO: consolidate all actions that need to take place only after
+            # self.accumulate_grad_batches steps (optimizer step, lr update, global step increment)
+            if (self.batch_idx + 1) % self.accumulate_grad_batches == 0:
+                # update lr
+                self.update_learning_rates(interval='step')

             # ---------------
             # RUN VAL STEP
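To make the new guard concrete, here is a small standalone sketch (plain Python, not Lightning code) of which batches trigger a learning-rate update under the added condition; `accumulate_grad_batches = 4` is an assumed example value.

```python
accumulate_grad_batches = 4  # assumed example value

for batch_idx in range(8):
    # mirrors the condition added above: the lr scheduler steps only when the optimizer steps
    if (batch_idx + 1) % accumulate_grad_batches == 0:
        print(f"batch_idx={batch_idx}: optimizer step + lr scheduler step")
    else:
        print(f"batch_idx={batch_idx}: accumulating gradients, no lr update")
```

Previously the scheduler stepped on every batch, so with gradient accumulation it advanced `accumulate_grad_batches` times faster than intended; the guard ties it to actual optimizer steps.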