lightning/pytorch_lightning
Latest commit f95ba20012 by Carlos Mocholí: Do not use the base version by default in `_compare_version` (#10051), 2021-10-25 16:41:32 +05:30
| Name | Last commit message | Commit date |
| --- | --- | --- |
| accelerators | Update `optimizer_step` methods in accelerator and plugins (#10023) | 2021-10-20 21:36:27 +01:00 |
| callbacks | Fix `LearningRateMonitor` logging with multiple param groups optimizer with no scheduler (#10044) | 2021-10-20 22:13:00 +05:30 |
| core | Fix: skip importing DistributedOptimizer for Windows (#10071) | 2021-10-21 21:01:56 +00:00 |
| distributed | Deprecate LightningDistributed and keep logic in ddp/ddpSpawn directly (#9691) | 2021-09-25 08:39:15 -07:00 |
| loggers | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00 |
| loops | Group all the logged gradients under the same sub-folder (#7756) | 2021-10-20 15:48:36 +00:00 |
| overrides | Update strategy flag in docs (#10000) | 2021-10-20 21:02:53 +05:30 |
| plugins | Remove redundant require_backward_grad_sync=False in sharded plugins (#10065) | 2021-10-22 03:07:58 +00:00 |
| profiler | fix noqa appearing in docs (#10116) | 2021-10-25 10:27:00 +00:00 |
| trainer | Add support for init_meta_context, materialize_module (#9920) | 2021-10-21 15:48:31 +01:00 |
| tuner | remove dataloader patching on the LightningModule (#9764) | 2021-10-20 15:23:20 +02:00 |
| utilities | Do not use the base version by default in `_compare_version` (#10051) | 2021-10-25 16:41:32 +05:30 |
| __about__.py | Prepare v1.5.0rc1 (#10068) | 2021-10-22 11:00:35 -04:00 |
| __init__.py | CI: yesqa (#8564) | 2021-08-02 16:05:56 +00:00 |
| py.typed | make PyTorch Lightning PEP 561 Compliant (#3187) | 2020-09-09 13:37:03 -04:00 |
| setup_tools.py | CI: precommit - docformatter (#8584) | 2021-09-06 12:49:09 +00:00 |