lightning/pytorch_lightning/callbacks
Latest commit: 2f84459d26 by ananthsub | Broadcast dirpath for tighter consistency in model checkpoint callback (#6978) | 2021-04-21 10:20:27 -07:00
__init__.py | Add Trainer max_time argument + Callback (#6823) | 2021-04-16 13:38:57 +02:00
base.py
early_stopping.py | more early stopping options (convergence and divergence threshold) (#6868) | 2021-04-19 16:49:52 +02:00
finetuning.py | Fix finetuning complex models correctly unfreezes. (#6880) | 2021-04-08 12:59:06 +05:30
gpu_stats_monitor.py | Add typings for evaluation_loop.py and remove some dead code (#7015) | 2021-04-15 07:36:04 +00:00
gradient_accumulation_scheduler.py
lambda_function.py | [Model Parallel] Add configure sharded model hook (#6679) | 2021-03-29 14:50:51 -06:00
lr_monitor.py
model_checkpoint.py | Broadcast dirpath for tighter consistency in model checkpoint callback (#6978) | 2021-04-21 10:20:27 -07:00
progress.py | Fix validation progress counter with check_val_every_n_epoch > 1 (#5952) | 2021-04-02 17:40:41 +09:00
pruning.py | Add typings for evaluation_loop.py and remove some dead code (#7015) | 2021-04-15 07:36:04 +00:00
quantization.py
stochastic_weight_avg.py | Add SWA warning if not running every epoch (#6987) | 2021-04-13 18:34:40 +02:00
timer.py | Add Trainer max_time argument + Callback (#6823) | 2021-04-16 13:38:57 +02:00