lightning/pytorch_lightning
ananthsub b39f4798a6
Add support to Tensorboard logger for OmegaConf hparams (#2846)
* Add support to Tensorboard logger for OmegaConf hparams

Address https://github.com/PyTorchLightning/pytorch-lightning/issues/2844

We check whether omegaconf can be imported and whether the hparams are OmegaConf instances. If so, we use OmegaConf.merge to preserve the typing, so that saving hparams to YAML actually triggers the OmegaConf branch.
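The logic described above can be sketched as follows. This is an illustrative standalone helper, not the actual pytorch-lightning code; the function name `merge_hparams` and the dict fallback are assumptions for the example.

```python
try:
    # Guarded import: omegaconf is an optional dependency.
    from omegaconf import Container, OmegaConf
    _OMEGACONF_AVAILABLE = True
except ImportError:
    _OMEGACONF_AVAILABLE = False


def merge_hparams(current, new):
    """Merge ``new`` hparams into ``current``, preserving OmegaConf typing.

    Illustrative sketch of the check described in the commit message,
    not the actual pytorch-lightning implementation.
    """
    if _OMEGACONF_AVAILABLE and isinstance(current, Container):
        # OmegaConf.merge returns an OmegaConf container, so the merged
        # hparams keep their type and later save-to-YAML code can take
        # its OmegaConf-specific branch.
        return OmegaConf.merge(current, OmegaConf.create(dict(new)))
    # Plain-dict fallback when omegaconf is unavailable or unused.
    merged = dict(current)
    merged.update(new)
    return merged
```

A plain `dict.update` would instead silently downcast a `DictConfig` to a `dict`, which is exactly the typing loss the merge avoids.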

* available

* chlog

* test

Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
2020-08-07 09:13:21 -04:00
accelerator_backends |                                                                 |
callbacks            | clarify batch hooks (#2842)                                     | 2020-08-05 20:01:30 -04:00
core                 | Add support to Tensorboard logger for OmegaConf hparams (#2846) | 2020-08-07 09:13:21 -04:00
loggers              | Add support to Tensorboard logger for OmegaConf hparams (#2846) | 2020-08-07 09:13:21 -04:00
metrics              | Faster Accuracy metric (#2775)                                  | 2020-08-06 11:40:35 +02:00
overrides            | Support returning python scalars in DP (#1935)                  | 2020-08-07 09:18:29 +02:00
profiler             |                                                                 |
trainer              | Add support to Tensorboard logger for OmegaConf hparams (#2846) | 2020-08-07 09:13:21 -04:00
utilities            | Bugfix: Lr finder and hparams compatibility (#2821)             | 2020-08-07 00:34:48 +02:00
__init__.py          | Update __init__.py                                              | 2020-08-05 20:45:11 -04:00