lightning/pytorch_lightning

Latest commit: Fix sigterm signal handling (#10189) (6ed7a0c172)
Author: Adrian Wälchli
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Date: 2021-10-29 00:01:39 +00:00
| Name | Last commit | Date |
| --- | --- | --- |
| accelerators | Fix gradient norm tracking and gradient clipping (#9287) | 2021-10-28 15:23:27 +00:00 |
| callbacks | Deprecate `lr_sch_names` from `LearningRateMonitor` (#10066) | 2021-10-28 12:57:04 +00:00 |
| core | Replace `_TORCH_GREATER_EQUAL_DEV_1_10` with `_TORCH_GREATER_EQUAL_1_10` (#10157) | 2021-10-27 13:38:39 +01:00 |
| distributed | Deprecate `LightningDistributed` and keep logic in ddp/ddpSpawn directly (#9691) | 2021-09-25 08:39:15 -07:00 |
| loggers | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00 |
| loops | Fix gradient norm tracking and gradient clipping (#9287) | 2021-10-28 15:23:27 +00:00 |
| overrides | Update strategy flag in docs (#10000) | 2021-10-20 21:02:53 +05:30 |
| plugins | Fix gradient norm tracking and gradient clipping (#9287) | 2021-10-28 15:23:27 +00:00 |
| profiler | Make sure file and folder exists in Profiler (#10073) | 2021-10-26 11:13:31 +00:00 |
| trainer | Fix sigterm signal handling (#10189) | 2021-10-29 00:01:39 +00:00 |
| tuner | Mark `trainer.data_connector` as protected (#10031) | 2021-10-25 12:29:09 +01:00 |
| utilities | Fix `reset_seed()` converting the `PL_SEED_WORKERS` environment variable `str` read to `bool` (#10099) | 2021-10-28 12:57:41 +00:00 |
| __about__.py | Prepare v1.5.0rc1 (#10068) | 2021-10-22 11:00:35 -04:00 |
| __init__.py | CI: yesqa (#8564) | 2021-08-02 16:05:56 +00:00 |
| py.typed | | |
| setup_tools.py | CI: precommit - docformatter (#8584) | 2021-09-06 12:49:09 +00:00 |