lightning/pytorch_lightning
Latest commit: Quant as optional step (#8464)
Author: Jirka Borovec (b7dbcc3e13), 2021-07-22 12:44:27 +00:00
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Name           | Last commit | Date
accelerators   | [bugfix] Reduce memory leaks (#8490) | 2021-07-21 11:37:05 +02:00
callbacks      | Quant as optional step (#8464) | 2021-07-22 12:44:27 +00:00
core           | [bugfix] Prevent deepcopy of dataloaders / Trainer in SWA Callback (#8472) | 2021-07-20 18:31:49 +00:00
distributed    | Fix broadcast for Windows minimal (#8331) | 2021-07-07 22:01:34 +00:00
loggers        | Add the bound instance as method parameter (#8466) | 2021-07-21 10:10:33 +00:00
loops          | Do not reset Loops total counters (#8475) | 2021-07-19 18:22:47 +02:00
metrics        | Sync our torchmetrics wrappers after the 0.4 release (#8205) | 2021-06-29 22:05:48 +00:00
overrides      | Move mixin to core (#8396) | 2021-07-19 10:15:59 +02:00
plugins        | fix: Enable manual optimization for TPUs (#8458) | 2021-07-22 15:33:35 +05:30
profiler       | Use literal syntax instead of function calls to create data structure (#8406) | 2021-07-14 10:32:13 +00:00
trainer        | Quant as optional step (#8464) | 2021-07-22 12:44:27 +00:00
tuner          | rename old `Trainer.train_loop` -> `Trainer.fit_loop` (#8025) | 2021-06-22 11:49:32 +02:00
utilities      | Maintain Backward compatibility for DeviceDtypeModuleMixin (#8474) | 2021-07-21 13:13:19 +02:00
__about__.py   | Merge pull request #8497 from PyTorchLightning/v1.4.0rc1 | 2021-07-21 09:54:27 -04:00
__init__.py    | rename about (#7002) | 2021-04-14 18:56:40 -04:00
py.typed       | make PyTorch Lightning PEP 561 Compliant (#3187) | 2020-09-09 13:37:03 -04:00
setup_tools.py | fix pip install (#7170) | 2021-04-22 16:48:11 -04:00