lightning/pytorch_lightning
Latest commit d99625fc8d by ananthsub: Reduce number of times optimizers are instantiated with FSDP (#12267) — 2022-03-21 18:18:59 +01:00
| Name           | Last commit                                                                                          | Date                      |
| -------------- | ---------------------------------------------------------------------------------------------------- | ------------------------- |
| accelerators   | add `accelerator.is_available()` check (#12104)                                                      | 2022-03-02 10:07:49 +00:00 |
| callbacks      | update docs for ModelCheckpoint save_last (#12332)                                                   | 2022-03-19 20:15:54 +00:00 |
| core           | Deprecate `LightningModule.use_amp` (#12315)                                                         | 2022-03-18 03:49:18 +01:00 |
| distributed    | Renamed the `DDPSpawnPlugin` to `DDPSpawnStrategy` (#11145)                                          | 2021-12-21 23:06:14 +00:00 |
| lite           | Rewrite accelerator_connector (#11448)                                                               | 2022-02-17 23:38:39 +00:00 |
| loggers        | Deprecate `LoggerCollection` in favor of `trainer.loggers` (#12147)                                  | 2022-03-04 23:01:43 +00:00 |
| loops          | check trainerfn == FITTING before configuring sync_batchnorm (#11919)                                | 2022-03-12 03:52:59 +00:00 |
| overrides      | Fix passing _ddp_params_and_buffers_to_ignore (#11949)                                               | 2022-02-24 17:22:48 +00:00 |
| plugins        | Refactor `TorchElasticEnvironment.detect` to use `torch.distributed.is_torchelastic_launched` (#12376) | 2022-03-21 16:51:24 +01:00 |
| profiler       | Deprecate `AbstractProfiler` in favor of `BaseProfiler` (#12106)                                     | 2022-03-05 02:35:57 +00:00 |
| strategies     | Reduce number of times optimizers are instantiated with FSDP (#12267)                                | 2022-03-21 18:18:59 +01:00 |
| trainer        | Remove `AcceleratorConnector.num_gpus` and deprecate `Trainer.num_gpus` (#12384)                     | 2022-03-21 18:06:39 +01:00 |
| tuner          | Integrate global step with progress tracking (#11805)                                                | 2022-03-07 19:21:37 +00:00 |
| utilities      | Refactor `TorchElasticEnvironment.detect` to use `torch.distributed.is_torchelastic_launched` (#12376) | 2022-03-21 16:51:24 +01:00 |
| __about__.py   | Update dev branch to continue with 1.6 (#10332)                                                      | 2021-11-03 14:07:55 +00:00 |
| __init__.py    | Add DETAIL logs for batch use cases (#11008)                                                         | 2022-01-12 01:22:48 +01:00 |
| py.typed       |                                                                                                      |                           |
| setup_tools.py | CI: precommit - docformatter (#8584)                                                                 | 2021-09-06 12:49:09 +00:00 |