From 09d4475cc714f12ba88d7dc28c573a2680b889b7 Mon Sep 17 00:00:00 2001
From: Lorenzo Fabbri
Date: Fri, 9 Aug 2019 21:02:36 +0200
Subject: [PATCH] Update Checkpointing.md (#83)

* Update Checkpointing.md

Modified import for ModelCheckpoint.

* Update Checkpointing.md
---
 docs/Trainer/Checkpointing.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/Trainer/Checkpointing.md b/docs/Trainer/Checkpointing.md
index 8c37764cdf..ad0c015837 100644
--- a/docs/Trainer/Checkpointing.md
+++ b/docs/Trainer/Checkpointing.md
@@ -1,11 +1,11 @@
-Lightning can automate saving and loading checkpoints.
+i Lightning can automate saving and loading checkpoints.
 
 ---
 ### Model saving
 To enable checkpointing, define the checkpoint callback and give it to the trainer.
 
 ``` {.python}
-from pytorch_lightning.utils.pt_callbacks import ModelCheckpoint
+from pytorch_lightning.callbacks import ModelCheckpoint
 
 checkpoint_callback = ModelCheckpoint(
     filepath='/path/to/store/weights.ckpt',
@@ -65,4 +65,4 @@ for scheduler, lrs_state in zip(self.lr_schedulers, lr_schedulers):
 
 # uses the model you passed into trainer
 model.load_state_dict(checkpoint['state_dict'])
-```
\ No newline at end of file
+```
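For context, the restore logic touched by the second hunk operates on a checkpoint dictionary: the model's weights live under `state_dict` and scheduler states under `lr_schedulers`. A minimal stdlib-only sketch of that pattern follows; the `TinyModel` and `TinyScheduler` classes and all dict values are placeholders standing in for `torch.nn.Module`/scheduler objects, not Lightning's real checkpoint format.

```python
# Sketch of restoring state from a checkpoint-like dict.
# Placeholder classes stand in for objects with a load_state_dict() method;
# the keys mirror those used in the patched doc snippet.

class TinyModel:
    """Stands in for a torch.nn.Module exposing load_state_dict()."""
    def __init__(self):
        self.weights = {}

    def load_state_dict(self, state):
        # Copy the saved weights into this model.
        self.weights = dict(state)


class TinyScheduler:
    """Stands in for an lr scheduler exposing load_state_dict()."""
    def __init__(self):
        self.state = {}

    def load_state_dict(self, state):
        # Copy the saved scheduler state.
        self.state = dict(state)


# A checkpoint as a plain dict (placeholder values, hypothetical keys
# modeled on the snippet in the patch).
checkpoint = {
    "state_dict": {"layer.weight": [0.1, 0.2]},
    "lr_schedulers": [{"last_epoch": 4}],
}

model = TinyModel()
schedulers = [TinyScheduler()]

# Mirrors: model.load_state_dict(checkpoint['state_dict'])
model.load_state_dict(checkpoint["state_dict"])

# Mirrors: for scheduler, lrs_state in zip(self.lr_schedulers, lr_schedulers)
for scheduler, lrs_state in zip(schedulers, checkpoint["lr_schedulers"]):
    scheduler.load_state_dict(lrs_state)

print(model.weights["layer.weight"])      # [0.1, 0.2]
print(schedulers[0].state["last_epoch"])  # 4
```

The same zip-over-schedulers shape works for optimizers as well, which is why Lightning stores each component's state under its own key in the checkpoint dict.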