lightning/pytorch_lightning
Latest commit 48b6292cf0 by Carlos Mocholí: Move optimizer step and clipping into the `PrecisionPlugin` (#10143), 2021-10-26 17:26:26 +02:00
| Name | Last commit | Last commit date |
| --- | --- | --- |
| `accelerators` | Move optimizer step and clipping into the `PrecisionPlugin` (#10143) | 2021-10-26 17:26:26 +02:00 |
| `callbacks` | Disable quantization aware training observers (#8540) | 2021-10-25 15:46:09 +00:00 |
| `core` | Move optimizer step and clipping into the `PrecisionPlugin` (#10143) | 2021-10-26 17:26:26 +02:00 |
| `distributed` | Deprecate LightningDistributed and keep logic in ddp/ddpSpawn directly (#9691) | 2021-09-25 08:39:15 -07:00 |
| `loggers` | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00 |
| `loops` | Small code simplification in `training_epoch_loop.py` (#10146) | 2021-10-26 13:22:36 +02:00 |
| `overrides` | Update strategy flag in docs (#10000) | 2021-10-20 21:02:53 +05:30 |
| `plugins` | Move optimizer step and clipping into the `PrecisionPlugin` (#10143) | 2021-10-26 17:26:26 +02:00 |
| `profiler` | Make sure file and folder exists in Profiler (#10073) | 2021-10-26 11:13:31 +00:00 |
| `trainer` | Avoid deprecated warnings from accelerator and checkpoint connector (#10142) | 2021-10-26 14:10:30 +02:00 |
| `tuner` | Mark `trainer.data_connector` as protected (#10031) | 2021-10-25 12:29:09 +01:00 |
| `utilities` | Do not use the base version by default in `_compare_version` (#10051) | 2021-10-25 16:41:32 +05:30 |
| `__about__.py` | Prepare v1.5.0rc1 (#10068) | 2021-10-22 11:00:35 -04:00 |
| `__init__.py` | CI: yesqa (#8564) | 2021-08-02 16:05:56 +00:00 |
| `py.typed` | | |
| `setup_tools.py` | CI: precommit - docformatter (#8584) | 2021-09-06 12:49:09 +00:00 |