lightning/pytorch_lightning
srush 27a3be0287
TPU gradient clipping. (#963)
* clip

* Update pytorch_lightning/trainer/training_tricks.py

* pull out epsilon

* add fp16 case

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-02-27 15:46:47 -05:00
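The commit bullets above ("clip", "pull out epsilon", "add fp16 case") describe gradient clipping done manually rather than via `torch.nn.utils.clip_grad_norm_`, which is the usual approach for TPU/XLA-friendly code. A minimal sketch of that technique, assuming hypothetical names (`clip_gradients`, `EPSILON`, `EPSILON_FP16`) and epsilon values not confirmed by the source:

```python
import torch

# Assumed constants: epsilon "pulled out" to module level, with a larger
# value for half precision ("add fp16 case"). Exact values are guesses.
EPSILON = 1e-6
EPSILON_FP16 = 1e-5

def clip_gradients(parameters, clip_val, use_fp16=False):
    """Rescale gradients so their global 2-norm is at most clip_val.

    Computes the norm with out-of-place tensor ops, a common pattern for
    code that must also run on XLA/TPU devices.
    """
    eps = EPSILON_FP16 if use_fp16 else EPSILON
    params = [p for p in parameters if p.grad is not None]
    if not params:
        return torch.tensor(0.0)
    # global 2-norm over all gradients
    norms = [p.grad.detach().norm(2) for p in params]
    total_norm = torch.norm(torch.stack(norms), 2)
    # scale factor; eps guards against division by zero
    clip_coef = clip_val / (total_norm + eps)
    # only ever scale gradients down, never up
    clip_coef = torch.clamp(clip_coef, max=1.0)
    for p in params:
        p.grad.detach().mul_(clip_coef)
    return total_norm
```

After calling this, the global gradient norm is bounded by `clip_val` (up to the epsilon slack), while gradients already within the bound are left unchanged.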
callbacks     Callbacks [wip] (#889)                                                2020-02-25 23:17:27 -05:00
core          clean docs (#966)                                                     2020-02-27 07:35:20 -05:00
loggers       Add support for multiple loggers (#903)                               2020-02-25 14:52:39 -05:00
logging       Fix backwards compatibility for optional logging dependencies (#900)  2020-02-21 13:18:27 -05:00
overrides     update deprecated messages (#810)                                     2020-02-11 07:41:15 -05:00
profiler      advanced profiler describe + cleaned up tests (#837)                  2020-02-15 23:43:43 -05:00
pt_overrides  update deprecated messages (#810)                                     2020-02-11 07:41:15 -05:00
root_module   update deprecated messages (#810)                                     2020-02-11 07:41:15 -05:00
trainer       TPU gradient clipping. (#963)                                         2020-02-27 15:46:47 -05:00
utilities     Simplify variables: step, epoch, max_epochs, min_epochs (#589)        2019-12-07 08:50:21 -05:00
__init__.py   Callbacks [wip] (#889)                                                2020-02-25 23:17:27 -05:00