elaborate on the correlation between overfit_pct and xxx_percent_check (#132)

* Update Training Loop.md

* update docs and elaborate on the correlation
Ir1dXD 2019-08-17 22:23:25 +08:00 committed by William Falcon
parent 1a31782272
commit 48de39ed50
3 changed files with 13 additions and 1 deletion

@ -54,7 +54,10 @@ trainer = Trainer(track_grad_norm=2)
---
#### Set how much of the training set to check
If you don't want to check 100% of the training set (for debugging or if it's huge), set this flag.
`train_percent_check` will be overridden by `overfit_pct` if `overfit_pct > 0`.
``` {.python}
# DEFAULT
trainer = Trainer(train_percent_check=1.0)
```

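The fraction semantics above can be sketched in plain Python; `num_batches_to_run` is a hypothetical helper used only for illustration, not part of the library's API:

``` {.python}
def num_batches_to_run(total_batches, percent_check=1.0):
    # Run only a fraction of the available batches; percent_check is a
    # float in (0, 1], e.g. 0.1 means use roughly 10% of the batches.
    # Always run at least one batch so the loop still executes.
    return max(1, int(total_batches * percent_check))

# e.g. with 500 training batches and train_percent_check=0.1,
# only 50 batches are used per epoch
```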
@ -18,6 +18,9 @@ trainer = Trainer(check_val_every_n_epoch=1)
---
#### Set how much of the validation set to check
If you don't want to check 100% of the validation set (for debugging or if it's huge), set this flag.
`val_percent_check` will be overridden by `overfit_pct` if `overfit_pct > 0`.
``` {.python}
# DEFAULT
trainer = Trainer(val_percent_check=1.0)
```
@ -29,6 +32,9 @@ trainer = Trainer(val_percent_check=0.1)
---
#### Set how much of the test set to check
If you don't want to check 100% of the test set (for debugging or if it's huge), set this flag.
`test_percent_check` will be overridden by `overfit_pct` if `overfit_pct > 0`.
``` {.python}
# DEFAULT
trainer = Trainer(test_percent_check=1.0)
```

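The validation fraction interacts with `check_val_every_n_epoch` (shown in the hunk context above): validation is skipped entirely on off epochs, and otherwise runs on a fraction of the batches. A minimal sketch, assuming a hypothetical helper `val_batches_this_epoch` (not a library function):

``` {.python}
def val_batches_this_epoch(epoch, total_val_batches,
                           val_percent_check=1.0, check_val_every_n_epoch=1):
    # Skip validation entirely except every n-th epoch.
    if epoch % check_val_every_n_epoch != 0:
        return 0
    # Otherwise run only a fraction of the validation batches,
    # but always at least one.
    return max(1, int(total_val_batches * val_percent_check))
```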
@ -23,6 +23,9 @@ trainer = Trainer(track_grad_norm=2)
---
#### Make model overfit on subset of data
A useful debugging trick is to make your model overfit a tiny fraction of the data.
Setting `overfit_pct > 0` will override `train_percent_check`, `val_percent_check`, and `test_percent_check`.
``` {.python}
# DEFAULT don't overfit (ie: normal training)
trainer = Trainer(overfit_pct=0.0)
```
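The override behavior described above can be sketched as a small resolution step; `resolve_percent_checks` is a hypothetical helper for illustration, not the library's internal code:

``` {.python}
def resolve_percent_checks(train_percent_check=1.0, val_percent_check=1.0,
                           test_percent_check=1.0, overfit_pct=0.0):
    # A positive overfit_pct overrides all three *_percent_check flags,
    # so train, val, and test all see the same small fraction of data.
    if overfit_pct > 0:
        return overfit_pct, overfit_pct, overfit_pct
    return train_percent_check, val_percent_check, test_percent_check
```

For example, with `overfit_pct=0.01` all three splits are limited to 1% of their data, regardless of the individual `*_percent_check` values.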