lightning/pytorch_lightning/trainer
Latest commit 2e7007c972 by deepsource-autofix[bot]: Refactor unnecessary `else` / `elif` when `if` block has a `raise` statement (#8403), 2021-07-14 14:56:14 +00:00
Co-authored-by: deepsource-autofix[bot] <62050782+deepsource-autofix[bot]@users.noreply.github.com>
| Name | Last commit | Date |
| --- | --- | --- |
| connectors | Refactor unnecessary `else` / `elif` when `if` block has a `raise` statement (#8403) | 2021-07-14 14:56:14 +00:00 |
| __init__.py | added trainer api docs (#4569) | 2020-11-07 14:18:45 -05:00 |
| callback_hook.py | Add the `on_before_optimizer_step` hook (#8048) | 2021-07-09 13:30:52 +02:00 |
| configuration_validator.py | [feat] Allow overriding optimizer_zero_grad and/or optimizer_step when using accumulate_grad_batches (#7980) | 2021-06-17 12:50:37 +02:00 |
| data_loading.py | Support state restoration of logged results 2/2 (#7966) | 2021-06-25 19:16:11 +00:00 |
| deprecated_api.py | rename old `Trainer.train_loop` -> `Trainer.fit_loop` (#8025) | 2021-06-22 11:49:32 +02:00 |
| logging.py | Actually show deprecation warnings and their line level [2/2] (#8002) | 2021-06-21 18:51:53 +02:00 |
| model_hooks.py | Support state restoration of logged results 2/2 (#7966) | 2021-06-25 19:16:11 +00:00 |
| optimizers.py | Support state restoration of logged results 2/2 (#7966) | 2021-06-25 19:16:11 +00:00 |
| progress.py | [Refactor] Improve loops API 1/n (#8334) | 2021-07-12 11:13:50 +00:00 |
| properties.py | Enables reload of dataloaders on every n epochs from every epoch (#5043) | 2021-07-07 13:10:08 +02:00 |
| states.py | `TrainerState` refactor [5/5] (#7173) | 2021-05-04 12:50:56 +02:00 |
| supporters.py | Refactor unnecessary `else` / `elif` when `if` block has a `return` statement (#8156) | 2021-06-28 15:27:41 +05:30 |
| trainer.py | Add logger flag to save_hyperparameters (#7960) | 2021-07-13 11:36:36 +02:00 |
| training_tricks.py | Support state restoration of logged results 2/2 (#7966) | 2021-06-25 19:16:11 +00:00 |