lightning/pytorch_lightning/core

Latest commit: 89cc12311f by Carlos Mocholí (2020-10-04 17:10:25 -04:00)
Fix tbptt_reduce_fx when non-floating tensors are logged (#3796)

* Add failing test
* force all tbptt vals to be floats for reduce

Co-authored-by: William Falcon <waf2107@columbia.edu>
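The commit body above explains the fix: reduction functions such as `torch.mean` raise a `RuntimeError` on integer tensors, so a TBPTT value logged as, say, `torch.tensor(3)` would crash the reduce step, and casting every value to float first avoids it. A minimal sketch of the failure and the cast-to-float workaround, assuming PyTorch (the function name `reduce_tbptt_values` is illustrative, not the actual Lightning internals):

```python
import torch

def reduce_tbptt_values(values, reduce_fx=torch.mean):
    """Sketch (not Lightning's real code): cast logged TBPTT values
    to float before applying the reduce function.

    `torch.mean` requires a floating-point dtype, so integer tensors
    must be converted first, as the commit message describes.
    """
    stacked = torch.stack([torch.as_tensor(v).float() for v in values])
    return reduce_fx(stacked)

ints = [torch.tensor(1), torch.tensor(2), torch.tensor(3)]

# Without the cast, reducing integer tensors fails:
try:
    torch.mean(torch.stack(ints))
except RuntimeError as err:
    print(f"mean on int tensors raises: {type(err).__name__}")

# With the cast, the reduction succeeds:
print(reduce_tbptt_values(ints))  # tensor(2.)
```

The cast is applied unconditionally rather than only to integer dtypes, which keeps the reduce path uniform at the cost of a no-op conversion for values that are already floats.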
__init__.py     document lightiningmodule better (#2920)                          2020-08-11 19:39:43 -04:00
datamodule.py   Add trainer attribute to datamodule (#3749)                       2020-10-01 00:41:19 +05:30
decorators.py   added copyright notices (#3062)                                   2020-08-19 22:03:22 -04:00
grads.py        added copyright notices (#3062)                                   2020-08-19 22:03:22 -04:00
hooks.py        Support checkpoint hooks on data module (#3563)                   2020-09-29 19:51:44 +02:00
lightning.py    ref: callback system and init ddp (1/n) (#3836)                   2020-10-03 23:39:17 -04:00
memory.py       Refactor GPUStatsMonitor to improve training speed (#3257)        2020-09-04 06:02:16 -04:00
saving.py       Use fsspec with OmegaConf saving in saving.py (#3782)             2020-10-02 15:37:37 +02:00
step_result.py  Fix tbptt_reduce_fx when non-floating tensors are logged (#3796)  2020-10-04 17:10:25 -04:00