lightning/pytorch_lightning
Latest commit c3614f1c07 by Kaushik B: Fix: skip importing DistributedOptimizer for Windows (#10071) (2021-10-21 21:01:56 +00:00)
Name | Last commit | Date
accelerators | Update `optimizer_step` methods in accelerator and plugins (#10023) | 2021-10-20 21:36:27 +01:00
callbacks | Fix `LearningRateMonitor` logging with multiple param groups optimizer with no scheduler (#10044) | 2021-10-20 22:13:00 +05:30
core | Fix: skip importing DistributedOptimizer for Windows (#10071) | 2021-10-21 21:01:56 +00:00
distributed | Deprecate LightningDistributed and keep logic in ddp/ddpSpawn directly (#9691) | 2021-09-25 08:39:15 -07:00
loggers | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00
loops | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00
overrides | Update strategy flag in docs (#10000) | 2021-10-20 21:02:53 +05:30
plugins | Fix: skip importing DistributedOptimizer for Windows (#10071) | 2021-10-21 21:01:56 +00:00
profiler | Fix `LightningOptimizer` step and toggling logic (#9958) | 2021-10-18 00:23:51 +00:00
trainer | Add support for init_meta_context, materialize_module (#9920) | 2021-10-21 15:48:31 +01:00
tuner | remove dataloader patching on the LightningModule (#9764) | 2021-10-20 15:23:20 +02:00
utilities | Fix: skip importing DistributedOptimizer for Windows (#10071) | 2021-10-21 21:01:56 +00:00
__about__.py | Prepare v1.5.0rc0 (#9893) | 2021-10-11 20:36:01 +01:00
__init__.py | CI: yesqa (#8564) | 2021-08-02 16:05:56 +00:00
py.typed | |
setup_tools.py | CI: precommit - docformatter (#8584) | 2021-09-06 12:49:09 +00:00