lightning/pytorch_lightning
Maxim Ostroukhov c208ac68c8
Added experiment_id to NeptuneLogger (#3462)
* 1) Added experiment_id to the NeptuneLogger initialization arguments (see the usage sketch after the commit details below).
2) The _create_or_get_experiment() function now overrides "experiment_name", "params", "properties", and "tags".

* Added test case for existing experiment.

* Revert "Added test case for existing experiment."

This reverts commit 9f3ba2e37b.

* Added test case for existing experiment.

* Fix merging issue.

* Moved the experiment_id assignment directly into the experiment initialization code.

* Update pytorch_lightning/loggers/neptune.py

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2020-11-16 23:50:23 +05:30
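
A minimal usage sketch of the change above, assuming the new keyword argument is exposed as experiment_id on NeptuneLogger and the logger otherwise keeps its existing neptune-client arguments; the api_key, project_name, and experiment_id values are placeholders, not taken from the PR.

    # Hypothetical sketch: pass the id of an already created Neptune experiment
    # so the logger resumes it instead of creating a new one.
    # All argument values below are placeholders (assumptions, not from the PR).
    from pytorch_lightning import Trainer
    from pytorch_lightning.loggers import NeptuneLogger

    neptune_logger = NeptuneLogger(
        api_key="ANONYMOUS",                                   # or a real Neptune API token
        project_name="shared/pytorch-lightning-integration",   # placeholder project
        experiment_id="SAN-123",                               # id of an existing experiment (assumed format)
    )

    trainer = Trainer(logger=neptune_logger, max_epochs=3)
    # trainer.fit(model)  # metrics and params then go to the existing experiment rather than a new one
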
accelerators [FEAT] Add lambda closure to manual_optimizer_step (#4618) 2020-11-12 19:22:06 +00:00
callbacks logger docs and api docs (#3950) 2020-11-13 20:35:54 +05:30
cluster_environments ref: unify slurm and TE under backendPlugin 1/n (#4578) 2020-11-08 14:28:55 -05:00
core allow decorate model init with saving hparams (#4662) 2020-11-16 11:02:26 +01:00
distributed notices (#4118) 2020-10-13 07:18:07 -04:00
loggers Added experiment_id to NeptuneLogger (#3462) 2020-11-16 23:50:23 +05:30
metrics [metrics] change default behaviour of state dict (#4685) 2020-11-16 12:33:45 +00:00
overrides removed support for EvalResult and TrainResult (#3968) 2020-10-07 22:39:16 -04:00
plugins Sharded Accelerator 1/n: Expose clip gradients to plugins via abstract class (#4639) 2020-11-12 17:18:09 +00:00
profiler logger docs and api docs (#3950) 2020-11-13 20:35:54 +05:30
trainer quick fix (#4697) 2020-11-16 16:20:35 +00:00
tuner Skip tuner algorithms on fast dev (#3903) 2020-11-10 00:34:42 +01:00
utilities allow decorate model init with saving hparams (#4662) 2020-11-16 11:02:26 +01:00
__init__.py Update __init__.py 2020-10-27 18:15:26 -04:00
py.typed make PyTorch Lightning PEP 561 Compliant (#3187) 2020-09-09 13:37:03 -04:00