From 7119295a8aa05bc481e53af2e861103793f698fc Mon Sep 17 00:00:00 2001
From: svlandeg
Date: Wed, 19 Aug 2020 17:53:22 +0200
Subject: [PATCH] batchers intro

---
 website/docs/api/top-level.md  | 12 +++++++++++-
 website/docs/usage/training.md |  2 +-
 2 files changed, 12 insertions(+), 2 deletions(-)

diff --git a/website/docs/api/top-level.md b/website/docs/api/top-level.md
index 48496bfd1..2398cb632 100644
--- a/website/docs/api/top-level.md
+++ b/website/docs/api/top-level.md
@@ -340,7 +340,17 @@ See the [`Transformer`](/api/transformer) API reference and
 
 ## Batchers {#batchers source="spacy/gold/batchers.py" new="3"}
 
-
+A batcher implements a batching strategy that essentially turns a stream of
+items into a stream of batches, with each batch consisting of one item or a list
+of items. During training, the models update their weights after processing one
+batch at a time. Typical batching strategies include presenting the training
+data as a stream of batches with similar sizes, or with increasing batch sizes.
+See the Thinc documentation on
+[`schedules`](https://thinc.ai/docs/api-schedules) for a few standard examples.
+
+Instead of using one of the built-in batchers listed here, you can also
+[implement your own](/usage/training#custom-code-readers-batchers), which may or
+may not use a custom schedule.
 
 #### batch_by_words.v1 {#batch_by_words tag="registered function"}
 
diff --git a/website/docs/usage/training.md b/website/docs/usage/training.md
index 259d3bb39..c4eaf1d88 100644
--- a/website/docs/usage/training.md
+++ b/website/docs/usage/training.md
@@ -743,7 +743,7 @@ the annotations are exactly the same.
 
 ```python
 ### functions.py
-from typing import Callable, Iterable, Iterator
+from typing import Callable, Iterable, Iterator, List
 import spacy
 from spacy.gold import Example
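For context on the documentation added above, here is a minimal sketch of the kind of custom batcher the new paragraph points to via [implement your own](/usage/training#custom-code-readers-batchers). It assumes the `spacy.registry.batchers` decorator and reuses the imports the second hunk adds to `functions.py`; the `filtering_batch.v1` name and the fixed-size grouping logic are illustrative assumptions, not part of this patch or of spaCy's built-in batchers.

```python
### functions.py (illustrative sketch, not part of this patch)
from typing import Callable, Iterable, Iterator, List

import spacy
from spacy.gold import Example  # import path used at the time of this patch


@spacy.registry.batchers("filtering_batch.v1")  # hypothetical name
def configure_fixed_size_batches(
    size: int,
) -> Callable[[Iterable[Example]], Iterator[List[Example]]]:
    """Return a batcher that groups the example stream into fixed-size batches."""

    def create_batches(examples: Iterable[Example]) -> Iterator[List[Example]]:
        batch: List[Example] = []
        for eg in examples:
            # collect examples until the batch reaches the configured size
            batch.append(eg)
            if len(batch) == size:
                yield batch
                batch = []
        if batch:
            # yield the final, possibly smaller, batch
            yield batch

    return create_batches
```

A batcher registered this way would then be referenced by its registered name from the training config, in the same manner as the built-in `batch_by_words.v1` documented in this section.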