lightning/pytorch_lightning/trainer

Latest commit: Gen ddp support (#1961)
William Falcon, 460ab5485e, 2020-05-26 19:02:30 -04:00
  * updated docs
  * added mixed
File | Last commit | Date
__init__.py | Allow user to select individual TPU core to train on (#1729) | 2020-05-17 16:30:54 -04:00
auto_mix_precision.py | support for native amp (#1561) | 2020-04-23 14:47:08 -04:00
callback_config.py | protect progress bar callback (#1855) | 2020-05-25 07:49:23 -04:00
callback_hook.py | remove extra kwargs from Trainer init (#1820) | 2020-05-17 09:14:54 -04:00
data_loading.py | add warning for shuffling in test/val (#1865) | 2020-05-18 09:53:02 -04:00
deprecated_api.py | protect progress bar callback (#1855) | 2020-05-25 07:49:23 -04:00
distrib_data_parallel.py | Fix user warning produced by apex + scheduler combination (#1873) | 2020-05-22 07:19:37 -04:00
distrib_parts.py | Remove unused param tpu_core_idx (#1948) | 2020-05-25 16:04:53 -04:00
evaluation_loop.py | Allow user to select individual TPU core to train on (#1729) | 2020-05-17 16:30:54 -04:00
ignored_warnings.py | Dim 0 warning (#256) | 2019-09-26 13:20:54 -04:00
logging.py | [WIP] Reduction when batch size < num gpus (#1609) | 2020-05-02 11:01:44 -04:00
lr_finder.py | protect progress bar callback (#1855) | 2020-05-25 07:49:23 -04:00
model_hooks.py | Fix typo (#1750) | 2020-05-07 09:25:54 -04:00
optimizers.py | Fix user warning produced by apex + scheduler combination (#1873) | 2020-05-22 07:19:37 -04:00
seed.py | Option to provide seed to random generators to ensure reproducibility (#1572) | 2020-05-12 07:53:20 -04:00
supporters.py | Fixed configure optimizer from dict without "scheduler" key (#1443) | 2020-04-10 11:43:06 -04:00
trainer.py | Gen ddp support (#1961) | 2020-05-26 19:02:30 -04:00
training_io.py | replace Hparams by init args (#1896) | 2020-05-24 18:59:08 -04:00
training_loop.py | early stopping checks on_validation_end (#1458) | 2020-05-25 17:33:00 +00:00
training_tricks.py | replace Hparams by init args (#1896) | 2020-05-24 18:59:08 -04:00
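
These modules are not standalone entry points: trainer.py defines the Trainer class, and the surrounding files (training_loop.py, distrib_data_parallel.py, auto_mix_precision.py, seed.py, ...) supply the mixins and helpers it composes. For orientation, the following is a minimal sketch of how a user of a release from roughly this era (~0.7/0.8) would touch the features those commits reference: seeding, mixed precision, DDP, and TPU-core selection. TinyModel, its random dataset, and the hyperparameters are illustrative only and not part of the repository; the distributed/AMP/TPU flags are commented out because they require the corresponding hardware.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyModel(pl.LightningModule):
    """Illustrative model, not part of the repository."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self(x), y)
        return {'loss': loss}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def train_dataloader(self):
        # random data, only to make the script self-contained
        x = torch.randn(64, 32)
        y = torch.randint(0, 2, (64,))
        return DataLoader(TensorDataset(x, y), batch_size=16)


if __name__ == '__main__':
    pl.seed_everything(42)  # reproducible runs (seed.py, #1572)
    trainer = pl.Trainer(
        max_epochs=1,
        # gpus=2, distributed_backend='ddp',  # DDP path (distrib_data_parallel.py, #1961)
        # precision=16,                       # native/apex mixed precision (auto_mix_precision.py, #1561)
        # tpu_cores=[1],                      # train on one specific TPU core (#1729)
    )
    trainer.fit(TinyModel())
```

Keeping the script CPU-only by default means it runs anywhere; uncommenting a single flag switches the corresponding code path in this directory without touching the model.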