lightning/pytorch_lightning/callbacks
Latest commit eb15abcd82 by scart97: Fix finetuning complex models correctly unfreezes. (#6880)
Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
2021-04-08 12:59:06 +05:30
File | Last commit | Date
__init__.py | cleaning SWA (#6259) | 2021-03-01 19:10:27 +01:00
base.py | [Model Parallel] Add configure sharded model hook (#6679) | 2021-03-29 14:50:51 -06:00
early_stopping.py | Remove legacy `Result` parameters (#6016) | 2021-03-28 11:55:08 +02:00
finetuning.py | Fix finetuning complex models correctly unfreezes. (#6880) | 2021-04-08 12:59:06 +05:30
gpu_stats_monitor.py | Document exceptions in callbacks (#5541) | 2021-02-15 10:24:36 +00:00
gradient_accumulation_scheduler.py | Add on_epoch_start to run at the beginning of every loop irrespective of train/val/test (#6498) | 2021-03-25 14:20:49 +01:00
lambda_function.py | [Model Parallel] Add configure sharded model hook (#6679) | 2021-03-29 14:50:51 -06:00
lr_monitor.py | Document exceptions in callbacks (#5541) | 2021-02-15 10:24:36 +00:00
model_checkpoint.py | Remove legacy `Result` parameters (#6016) | 2021-03-28 11:55:08 +02:00
progress.py | Fix validation progress counter with check_val_every_n_epoch > 1 (#5952) | 2021-04-02 17:40:41 +09:00
pruning.py | Sanitize `None` params during pruning (#6836) | 2021-04-06 01:47:59 +02:00
quantization.py | Docs for Pruning, Quantization, and SWA (#6041) | 2021-02-18 13:51:51 +00:00
stochastic_weight_avg.py | Enforce an epoch scheduler interval when using SWA (#6588) | 2021-04-06 02:57:33 +00:00