lightning/pytorch_lightning/trainer
Adrian Wälchli 8211256c46
data transfer model hook (+ refactor) (#1756)
* refactor and added hook
    variant a
    variant b
    add test
    revert rename
    add changelog
    docs
* resolve merge duplication
* overridden typo
* fix test
* tpu id
* raise if TPU not available
* re-use apply_to_collection function for parsing collections
* comment
* make utility function available to user
* documentation
* move changelog entry to top
* fix tpu transfer call
* fix call
* remove hardcoded string
* improve test
* call model hook by default
* Apply suggestions from code review
* rename utility function

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-06-02 21:45:19 -04:00
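The bullets above describe the core of this PR: a utility that walks an arbitrarily nested batch (dicts, lists, tuples) and moves every tensor to the target device, built on `apply_to_collection`, exposed to users, and called by default through a model hook. As a rough, self-contained sketch of that traversal idea (not Lightning's actual implementation — real code operates on `torch.Tensor`; here `FakeTensor` is a hypothetical stand-in so the example needs no dependencies):

```python
from typing import Any, Callable


def apply_to_collection(data: Any, dtype: type, function: Callable) -> Any:
    """Recursively apply `function` to every element of type `dtype` found in a
    (possibly nested) dict/list/tuple collection; other values pass through."""
    if isinstance(data, dtype):
        return function(data)
    if isinstance(data, dict):
        return {k: apply_to_collection(v, dtype, function) for k, v in data.items()}
    if isinstance(data, (list, tuple)):
        return type(data)(apply_to_collection(v, dtype, function) for v in data)
    return data


class FakeTensor:
    """Hypothetical stand-in for a tensor: only carries a device and a .to()."""

    def __init__(self, device: str = "cpu"):
        self.device = device

    def to(self, device: str) -> "FakeTensor":
        return FakeTensor(device)


def move_data_to_device(batch: Any, device: str) -> Any:
    """Move every tensor-like element of `batch` onto `device`, leaving
    non-tensor values (ints, strings, ...) untouched."""
    return apply_to_collection(batch, FakeTensor, lambda t: t.to(device))


batch = {"x": FakeTensor(), "y": [FakeTensor(), 3]}
moved = move_data_to_device(batch, "cuda:0")
```

In Lightning itself, a model hook with this default behavior is what the "call model hook by default" bullet refers to: users can override the hook on their module to customize how a custom batch type is transferred, and the trainer falls back to the collection-walking utility otherwise.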
File                      Last commit                                                          Date
__init__.py Update/merge multi-gpu docs (#2021) 2020-06-02 18:50:08 -04:00
auto_mix_precision.py support for native amp (#1561) 2020-04-23 14:47:08 -04:00
callback_config.py protect progress bar callback (#1855) 2020-05-25 07:49:23 -04:00
callback_hook.py remove extra kwargs from Trainer init (#1820) 2020-05-17 09:14:54 -04:00
data_loading.py Raise an error when lightning replaces an existing sampler (#2020) 2020-06-02 18:52:04 -04:00
deprecated_api.py protect progress bar callback (#1855) 2020-05-25 07:49:23 -04:00
distrib_data_parallel.py Replaces ddp .spawn with subprocess (#2029) 2020-06-01 11:00:32 -04:00
distrib_parts.py data transfer model hook (+ refactor) (#1756) 2020-06-02 21:45:19 -04:00
evaluation_loop.py data transfer model hook (+ refactor) (#1756) 2020-06-02 21:45:19 -04:00
ignored_warnings.py Dim 0 warning (#256) 2019-09-26 13:20:54 -04:00
logging.py [WIP] Reduction when batch size < num gpus (#1609) 2020-05-02 11:01:44 -04:00
lr_finder.py protect progress bar callback (#1855) 2020-05-25 07:49:23 -04:00
model_hooks.py Fix typo (#1750) 2020-05-07 09:25:54 -04:00
optimizers.py Fix user warning produced by apex + scheduler combination (#1873) 2020-05-22 07:19:37 -04:00
seed.py Option to provide seed to random generators to ensure reproducibility (#1572) 2020-05-12 07:53:20 -04:00
supporters.py Fixed configure optimizer from dict without "scheduler" key (#1443) 2020-04-10 11:43:06 -04:00
trainer.py Mistake in parameters' grad norm tracking (#2012) 2020-06-02 18:51:09 -04:00
training_io.py Keep track of the best model's path saved by ModelCheckpoint (#1799) 2020-05-31 08:47:13 -04:00
training_loop.py data transfer model hook (+ refactor) (#1756) 2020-06-02 21:45:19 -04:00
training_tricks.py replace Hparams by init args (#1896) 2020-05-24 18:59:08 -04:00