lightning/pytorch_lightning
Latest commit: 35a6a6c705 by Kaushik B, 2021-07-21 13:13:19 +02:00
Maintain Backward compatibility for DeviceDtypeModuleMixin (#8474)

* Maintain Backward compatibility for DeviceDtypeModuleMixin
* Update message
* Add deprecation test
* Update test
accelerators      [bugfix] Reduce memory leaks (#8490)                                           2021-07-21 11:37:05 +02:00
callbacks         [bugfix] Re-compute accumulated_grad_batches (#8493)                           2021-07-21 10:46:25 +00:00
core              [bugfix] Prevent deepcopy of dataloaders / Trainer in SWA Callback (#8472)     2021-07-20 18:31:49 +00:00
distributed       Fix broadcast for Windows minimal (#8331)                                      2021-07-07 22:01:34 +00:00
loggers           Add the bound instance as method parameter (#8466)                             2021-07-21 10:10:33 +00:00
loops             Do not reset Loops total counters (#8475)                                      2021-07-19 18:22:47 +02:00
metrics           Sync our torchmetrics wrappers after the 0.4 release (#8205)                   2021-06-29 22:05:48 +00:00
overrides         Move mixin to core (#8396)                                                     2021-07-19 10:15:59 +02:00
plugins           [bugfix] Reduce memory leaks (#8490)                                           2021-07-21 11:37:05 +02:00
profiler          Use literal syntax instead of function calls to create data structure (#8406)  2021-07-14 10:32:13 +00:00
trainer           [bugfix] Re-compute accumulated_grad_batches (#8493)                           2021-07-21 10:46:25 +00:00
tuner             rename old `Trainer.train_loop` -> `Trainer.fit_loop` (#8025)                  2021-06-22 11:49:32 +02:00
utilities         Maintain Backward compatibility for DeviceDtypeModuleMixin (#8474)             2021-07-21 13:13:19 +02:00
__about__.py      prepare RC0 (#8399)                                                            2021-07-15 21:25:29 +00:00
__init__.py
py.typed
setup_tools.py    fix pip install (#7170)                                                        2021-04-22 16:48:11 -04:00