diff --git a/README.md b/README.md
index 629a5ed1cb..33480dc4b4 100644
--- a/README.md
+++ b/README.md
@@ -324,6 +324,7 @@ tensorboard --logdir /some/path
 
 - [Accumulate gradients](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#accumulated-gradients)
 - [Force training for min or max epochs](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-training-for-min-or-max-epochs)
+- [Early stopping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#early-stopping)
 - [Force disable early stop](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-disable-early-stop)
 - [Gradient Clipping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#gradient-clipping)
 - [Hooks](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/)
diff --git a/docs/Trainer/Training Loop.md b/docs/Trainer/Training Loop.md
index f7ad91da4b..2779f869a4 100644
--- a/docs/Trainer/Training Loop.md
+++ b/docs/Trainer/Training Loop.md
@@ -19,6 +19,24 @@ It can be useful to force training for a minimum number of epochs or limit to a
 trainer = Trainer(min_nb_epochs=1, max_nb_epochs=1000)
 ```
 
+---
+#### Early stopping
+To enable early stopping, define the callback and pass it to the trainer.
+``` {.python}
+from pytorch_lightning.callbacks import EarlyStopping
+
+# DEFAULTS
+early_stop_callback = EarlyStopping(
+    monitor='val_loss',
+    min_delta=0.00,
+    patience=0,
+    verbose=False,
+    mode='auto'
+)
+
+trainer = Trainer(early_stop_callback=early_stop_callback)
+```
+
 ---
 #### Force disable early stop
 Use this to turn off early stopping and run training to the [max_epoch](#force-training-for-min-or-max-epochs)
diff --git a/docs/index.md b/docs/index.md
index c163b0f409..cebbd158c9 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -120,6 +120,7 @@ Notice a few things about this flow:
 
 - [Accumulate gradients](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#accumulated-gradients)
 - [Force training for min or max epochs](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-training-for-min-or-max-epochs)
+- [Early stopping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#early-stopping)
 - [Force disable early stop](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-disable-early-stop)
 - [Gradient Clipping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#gradient-clipping)
 - [Hooks](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/)
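The `monitor`/`min_delta`/`patience` semantics documented in this diff can be sketched in plain Python. This is an illustrative stand-in, not Lightning's actual `EarlyStopping` implementation: `SimpleEarlyStopping` is a hypothetical name, and it takes an explicit `'min'`/`'max'` mode instead of `'auto'` (which infers the direction from the metric name).

```python
class SimpleEarlyStopping:
    """Illustrative early-stopping logic: stop once the monitored
    metric has failed to improve by at least min_delta for more
    than `patience` consecutive epochs."""

    def __init__(self, monitor='val_loss', min_delta=0.0, patience=0, mode='min'):
        self.monitor = monitor
        self.min_delta = min_delta
        self.patience = patience
        self.mode = mode
        # Best value seen so far; 'min' watches for decreases, 'max' for increases.
        self.best = float('inf') if mode == 'min' else float('-inf')
        self.wait = 0  # epochs since the last improvement

    def on_epoch_end(self, metrics):
        """Return True if training should stop after this epoch."""
        current = metrics[self.monitor]
        if self.mode == 'min':
            improved = current < self.best - self.min_delta
        else:
            improved = current > self.best + self.min_delta
        if improved:
            self.best = current
            self.wait = 0
            return False
        self.wait += 1
        return self.wait > self.patience
```

With the default `patience=0` shown in the diff, this stops on the first epoch where the monitored metric fails to improve.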