lightning/pytorch_lightning
Caldera 6c18fd9a24
Update lr_logger.py (#2847)
* Update lr_logger.py

when logging the learning rate, we should provide different logging choices, including 'step' and 'epoch'

* Update lr_logger.py

add some type annotations and docstrings

* Update lr_logger.py

fixed a bug where `on_train_batch_start()` was never triggered; use `on_batch_start()` instead. Added an `interval` argument so that learning rates can be recorded with respect to `global_step` or `current_epoch`.

* Update lr_logger.py

restore _extract_lr()

* suggestion

* Update lr_logger.py

modify `_extract_lr()`; it no longer needs the `interval` parameter.

* Update test_lr_logger.py

SkafteNicki's suggestion

* log_interval now supports `None`, `step`, `epoch`

* change `log_interval` to `logging_interval`

* Update test_lr_logger.py

* Update lr_logger.py

* put type checks into `on_train_start()`

* cleanup

* docstring typos

* minor changes from suggestions

Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
Co-authored-by: rohitgr7 <rohitgr1998@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
2020-08-09 16:30:43 +00:00
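The `logging_interval` behavior this PR describes (`None` logs on both hooks, `'step'` logs every training batch, `'epoch'` logs once per epoch) can be sketched as a minimal standalone callback. This is a simplified illustration of the idea, not the actual PyTorch Lightning implementation: the class name, the dict-shaped optimizers (mimicking torch's `param_groups`), and the return values are all assumptions made to keep the example self-contained.

```python
from typing import Dict, List, Optional


class LearningRateLoggerSketch:
    """Hypothetical, simplified version of the callback discussed above."""

    def __init__(self, logging_interval: Optional[str] = None) -> None:
        # Type check as in the PR: only None, 'step', or 'epoch' are valid.
        if logging_interval not in (None, "step", "epoch"):
            raise ValueError("logging_interval should be None, 'step' or 'epoch'")
        self.logging_interval = logging_interval
        self.lrs: Dict[str, List[float]] = {}  # name -> recorded learning rates

    def _extract_lr(self, optimizers: List[dict]) -> Dict[str, float]:
        # Collect the current lr of every param group of every optimizer.
        # Note: no `interval` parameter needed here, matching the PR change.
        latest = {}
        for i, opt in enumerate(optimizers):
            for j, group in enumerate(opt["param_groups"]):
                name = f"lr-optimizer{i}/pg{j}"
                latest[name] = group["lr"]
                self.lrs.setdefault(name, []).append(group["lr"])
        return latest

    def on_batch_start(self, optimizers: List[dict]) -> Dict[str, float]:
        # Log per step, unless the user asked for epoch-level logging only.
        if self.logging_interval in (None, "step"):
            return self._extract_lr(optimizers)
        return {}

    def on_epoch_start(self, optimizers: List[dict]) -> Dict[str, float]:
        # Log once per epoch, unless step-only logging was requested.
        if self.logging_interval in (None, "epoch"):
            return self._extract_lr(optimizers)
        return {}
```

With `logging_interval="epoch"`, `on_batch_start` becomes a no-op and only `on_epoch_start` records learning rates; with `None`, both hooks record, which is the backward-compatible default.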
accelerators
callbacks Update lr_logger.py (#2847) 2020-08-09 16:30:43 +00:00
core Squashed commit of the following: (#2164) 2020-08-09 06:08:44 -04:00
loggers Squashed commit of the following: (#2164) 2020-08-09 06:08:44 -04:00
metrics fix reduction docstring and clean tests (#2885) 2020-08-09 06:03:24 -04:00
overrides
profiler
trainer Add tracking of basic states in Trainer [wip - to-be-merged after v0.9] (#2541) 2020-08-09 06:24:09 -04:00
utilities Squashed commit of the following: (#2164) 2020-08-09 06:08:44 -04:00
__init__.py Update __init__.py 2020-08-09 10:05:39 -04:00