updated docs

This commit is contained in:
William Falcon 2020-04-25 13:04:34 -04:00
parent cbd088bd13
commit 1e2c9eaf89
1 changed files with 51 additions and 41 deletions


@@ -3,21 +3,27 @@ Lightning supports the most popular logging frameworks (TensorBoard, Comet, Weig
To use a logger, simply pass it into the :class:`~pytorch_lightning.trainer.trainer.Trainer`.
Lightning uses TensorBoard by default.

.. code-block:: python

    from pytorch_lightning import Trainer
    from pytorch_lightning import loggers
    tb_logger = loggers.TensorBoardLogger('logs/')
    trainer = Trainer(logger=tb_logger)
Choose from any of the others such as MLflow, Comet, Neptune, WandB, ...

.. code-block:: python

    comet_logger = loggers.CometLogger(save_dir='logs/')
    trainer = Trainer(logger=comet_logger)
To use multiple loggers, simply pass in a ``list`` or ``tuple`` of loggers ...

.. code-block:: python

    tb_logger = loggers.TensorBoardLogger('logs/')
    comet_logger = loggers.CometLogger(save_dir='logs/')
    trainer = Trainer(logger=[tb_logger, comet_logger])
Note:
    All loggers log by default to ``os.getcwd()``. To change the path without creating a logger set
@@ -30,31 +36,33 @@ You can implement your own logger by writing a class that inherits from
:class:`LightningLoggerBase`. Use the :func:`~pytorch_lightning.loggers.base.rank_zero_only`
decorator to make sure that only the first process in DDP training logs data.
.. code-block:: python

    from pytorch_lightning.utilities import rank_zero_only
    from pytorch_lightning.loggers import LightningLoggerBase

    class MyLogger(LightningLoggerBase):

        @rank_zero_only
        def log_hyperparams(self, params):
            # params is an argparse.Namespace
            # your code to record hyperparameters goes here
            pass

        @rank_zero_only
        def log_metrics(self, metrics, step):
            # metrics is a dictionary of metric names and values
            # your code to record metrics goes here
            pass

        def save(self):
            # Optional. Any code necessary to save logger data goes here
            pass

        @rank_zero_only
        def finalize(self, status):
            # Optional. Any code that needs to be run after training
            # finishes goes here
            pass
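The real ``rank_zero_only`` decorator lives in ``pytorch_lightning.utilities``; the standalone sketch below only illustrates the idea, assuming the process's rank is exposed through a ``LOCAL_RANK`` environment variable (an assumption of this sketch, not a documented contract). In DDP every process runs the same code, so the decorated function silently becomes a no-op on any rank other than zero.

```python
import functools
import os


def rank_zero_only(fn):
    """Illustrative sketch: run fn only on the rank-0 process, skip elsewhere."""
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        if int(os.environ.get("LOCAL_RANK", 0)) == 0:
            return fn(*args, **kwargs)
        # On ranks > 0 the call is silently dropped.
    return wrapped


calls = []

@rank_zero_only
def log_metrics(metrics, step):
    calls.append((step, metrics))


os.environ.pop("LOCAL_RANK", None)   # simulate the rank-0 process
log_metrics({'loss': 0.1}, step=0)   # recorded

os.environ["LOCAL_RANK"] = "1"       # simulate a worker process
log_metrics({'loss': 0.2}, step=1)   # skipped
# calls holds only the rank-0 record.
```

This is why the docs above decorate ``log_hyperparams`` and ``log_metrics``: without the guard, every DDP process would write a duplicate copy of each record.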
If you write a logger that may be useful to others, please send
a pull request to add it to Lightning!
@@ -65,18 +73,20 @@ Using loggers
Call the logger anywhere except ``__init__`` in your
:class:`~pytorch_lightning.core.lightning.LightningModule` by doing:
.. code-block:: python

    from pytorch_lightning import LightningModule

    class LitModel(LightningModule):
        def training_step(self, batch, batch_idx):
            # example
            self.logger.experiment.whatever_method_summary_writer_supports(...)

            # example if logger is a tensorboard logger
            self.logger.experiment.add_image('images', grid, 0)
            self.logger.experiment.add_graph(model, images)

        def any_lightning_module_function_or_hook(self):
            self.logger.experiment.add_histogram(...)
Read more in the `Experiment Logging use case <./experiment_logging.html>`_.