Fix typo in `log_graph` docs (#19254)
This commit is contained in: parent 7a4b0fc433, commit 1bd27447d9
```diff
@@ -15,7 +15,7 @@
   - `PR9699`_
 * - used Trainer’s flag ``checkpoint_callback``
-  - set ``enable_checkpointing``. If you set ``enable_checkpointing=True``, it configures a default ``ModelCheckpoint`` callback if none is provided ``lightning_pytorch.trainer.trainer.Trainer.callbacks.ModelCheckpoint``
+  - set ``enable_checkpointing``. If you set ``enable_checkpointing=True``, it configures a default ``ModelCheckpoint`` callback if none is provided ``lightning.pytorch.trainer.trainer.Trainer.callbacks.ModelCheckpoint``
   - `PR9754`_
 * - used Trainer’s flag ``stochastic_weight_avg``
```
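The table row above documents replacing the removed ``checkpoint_callback`` Trainer flag with ``enable_checkpointing``. A minimal sketch of the new spelling, assuming ``lightning`` 2.x is installed (not run here):

```python
# Assumes the lightning 2.x package layout referenced by this commit.
from lightning.pytorch import Trainer

# With enable_checkpointing=True (the default), the Trainer wires up a
# default ModelCheckpoint callback when none is passed in `callbacks`.
trainer = Trainer(enable_checkpointing=True)
```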
```diff
@@ -61,7 +61,7 @@ Multiple loggers support visualizing the model topology. Here's an example that
 .. code-block:: python

     def any_lightning_module_function_or_hook(self):
-        tensorboard_logger = self.logger.experiment
+        tensorboard_logger = self.logger

         prototype_array = torch.Tensor(32, 1, 28, 27)
         tensorboard_logger.log_graph(model=self, input_array=prototype_array)
```
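The typo fixed here matters because ``log_graph`` is defined on the ``TensorBoardLogger`` itself, while ``self.logger.experiment`` returns the underlying ``SummaryWriter``, which exposes ``add_graph`` instead. A self-contained sketch of that relationship, using stand-in classes (``FakeTensorBoardLogger`` and ``FakeSummaryWriter`` are illustrative names, not Lightning API):

```python
class FakeSummaryWriter:
    """Stand-in for torch.utils.tensorboard.SummaryWriter."""

    def __init__(self):
        self.graphs = []

    def add_graph(self, model, input_to_model):
        # SummaryWriter records graphs via add_graph, not log_graph.
        self.graphs.append((model, input_to_model))


class FakeTensorBoardLogger:
    """Stand-in for lightning.pytorch.loggers.TensorBoardLogger."""

    def __init__(self):
        # .experiment exposes the raw writer, as in Lightning.
        self.experiment = FakeSummaryWriter()

    def log_graph(self, model, input_array):
        # The logger-level helper delegates to the underlying writer.
        self.experiment.add_graph(model, input_to_model=input_array)


logger = FakeTensorBoardLogger()
# Correct call site per the fixed docs: the logger, not logger.experiment.
logger.log_graph(model="model", input_array="prototype_array")
```

So calling ``log_graph`` on ``self.logger.experiment``, as the old docs showed, would raise ``AttributeError`` on a real ``SummaryWriter``; the fixed example calls it on ``self.logger``.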