6c18fd9a24
* Update lr_logger.py: when logging learning_rate, we should provide different choices of interval to log against, including 'step' and 'epoch'
* Update lr_logger.py: add some type annotations and docstrings
* Update lr_logger.py: fix a bug where `on_train_batch_start()` can't be triggered; use `on_batch_start()` instead. Add an `interval` arg so that learning rates can be recorded with respect to `global_step` or `current_epoch`
* Update lr_logger.py: restore `_extract_lr()`
* Apply suggestion
* Update lr_logger.py: modify `_extract_lr()` so it no longer needs the `interval` parameter
* Update test_lr_logger.py: apply SkafteNicki's suggestion
* `log_interval` now supports `None`, `step`, and `epoch`
* Rename `log_interval` to `logging_interval`
* Update test_lr_logger.py
* Update lr_logger.py
* Move type checks into `on_train_start()`
* Cleanup
* Fix docstring typos
* Minor changes from suggestions

Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
Co-authored-by: rohitgr7 <rohitgr1998@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
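The commit log above describes a callback that records learning rates either per step or per epoch, with the `logging_interval` argument validated in `on_train_start()`. The following is a minimal plain-Python sketch of that idea, not the actual Lightning implementation; the class name, the dict-shaped optimizers, and the hook signatures are simplifications introduced here for illustration.

```python
# Sketch of interval-gated learning-rate logging as described in the
# commit log. Hypothetical names; not the real LearningRateLogger API.
class LearningRateLoggerSketch:
    def __init__(self, logging_interval=None):
        # logging_interval: None (log on both hooks), 'step', or 'epoch'.
        self.logging_interval = logging_interval
        self.lrs = {}

    def on_train_start(self, optimizers):
        # Validation is deferred to training start, mirroring the PR's
        # "put types check into on_train_start()" change.
        if self.logging_interval not in (None, 'step', 'epoch'):
            raise ValueError(
                "logging_interval should be None, 'step' or 'epoch'")
        self.lrs = {f'lr-{i}': [] for i in range(len(optimizers))}

    def _extract_lr(self, optimizers):
        # Read the current lr from each optimizer's param groups;
        # note it takes no `interval` parameter, as in the final PR.
        latest = {}
        for i, opt in enumerate(optimizers):
            for group in opt['param_groups']:
                latest[f'lr-{i}'] = group['lr']
        return latest

    def on_batch_start(self, optimizers):
        # Records against global_step when interval is 'step' (or None).
        if self.logging_interval in (None, 'step'):
            for name, lr in self._extract_lr(optimizers).items():
                self.lrs[name].append(lr)

    def on_epoch_start(self, optimizers):
        # Records against current_epoch when interval is 'epoch' (or None).
        if self.logging_interval in (None, 'epoch'):
            for name, lr in self._extract_lr(optimizers).items():
                self.lrs[name].append(lr)
```

With `logging_interval='step'`, the epoch hook becomes a no-op, so each recorded value corresponds to one training step; with `'epoch'` the reverse holds, which is what lets the same callback plot learning rates against either axis.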
Files in this directory:

* __init__.py
* test_callbacks.py
* test_early_stopping.py
* test_lr_logger.py
* test_model_checkpoint.py
* test_progress_bar.py