From 63c475c600f3d88aa39a431ac1e621321cdacc94 Mon Sep 17 00:00:00 2001
From: Wouter van Amsterdam
Date: Fri, 4 Oct 2019 13:14:30 +0200
Subject: [PATCH] tiny spelling error (#295)

---
 docs/Trainer/index.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/Trainer/index.md b/docs/Trainer/index.md
index 519024aa51..7731e4e4b5 100644
--- a/docs/Trainer/index.md
+++ b/docs/Trainer/index.md
@@ -36,7 +36,7 @@ But of course the fun is in all the advanced things it can do:
 - [Log GPU usage](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#Log-gpu-usage)
 - [Make model overfit on subset of data](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#make-model-overfit-on-subset-of-data)
 - [Print the parameter count by layer](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#print-the-parameter-count-by-layer)
-- [Pring which gradients are nan](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#print-which-gradients-are-nan)
+- [Print which gradients are nan](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#print-which-gradients-are-nan)
 - [Print input and output size of every module in system](https://williamfalcon.github.io/pytorch-lightning/LightningModule/properties/#example_input_array)
 
 
@@ -84,4 +84,4 @@ But of course the fun is in all the advanced things it can do:
 
 **Testing loop**
 
-- [Run test set](https://williamfalcon.github.io/pytorch-lightning/Trainer/Testing%20loop/)
\ No newline at end of file
+- [Run test set](https://williamfalcon.github.io/pytorch-lightning/Trainer/Testing%20loop/)