From 3449c45fd905393d6e94866d66a00ec8d62f6880 Mon Sep 17 00:00:00 2001
From: Ines Montani
Date: Wed, 29 Jul 2020 19:48:26 +0200
Subject: [PATCH] Update docs [ci skip]

---
 website/docs/usage/training.md     | 13 ++++++++++++-
 website/docs/usage/transformers.md |  7 ++++---
 2 files changed, 16 insertions(+), 4 deletions(-)

diff --git a/website/docs/usage/training.md b/website/docs/usage/training.md
index 948a13086..12785b6de 100644
--- a/website/docs/usage/training.md
+++ b/website/docs/usage/training.md
@@ -243,7 +243,14 @@ compound = 1.001
 
 ### Using transformer models like BERT {#transformers}
 
-
+spaCy v3.0 lets you use almost any statistical model to power your pipeline. You
+can use models implemented in a variety of frameworks. A transformer model is
+just a statistical model, so the
+[`spacy-transformers`](https://github.com/explosion/spacy-transformers) package
+actually has very little work to do: it just has to provide a few functions that
+do the required plumbing. It also provides a pipeline component,
+[`Transformer`](/api/transformer), that lets you do multi-task learning and lets
+you save the transformer outputs for later use.
 
@@ -253,6 +260,10 @@ visualize your model.
 
+For more details on how to integrate transformer models into your training
+config and customize the implementations, see the usage guide on
+[training transformers](/usage/transformers#training).
+
 ### Pretraining with spaCy {#pretraining}
 
diff --git a/website/docs/usage/transformers.md b/website/docs/usage/transformers.md
index a7fd83ac6..bab1b82d3 100644
--- a/website/docs/usage/transformers.md
+++ b/website/docs/usage/transformers.md
@@ -18,8 +18,8 @@ frameworks to be wrapped with a common interface, using our machine learning
 library [Thinc](https://thinc.ai).
 
 A transformer model is just a statistical model, so the
 [`spacy-transformers`](https://github.com/explosion/spacy-transformers) package
-actually has very little work to do: we just have to provide a few functions
-that do the required plumbing. We also provide a pipeline component,
+actually has very little work to do: it just has to provide a few functions that
+do the required plumbing. It also provides a pipeline component,
 [`Transformer`](/api/transformer), that lets you do multi-task learning and lets
 you save the transformer outputs for later use.
@@ -201,7 +201,8 @@ def configure_custom_sent_spans():
 
 To resolve the config during training, spaCy needs to know about your custom
 function. You can make it available via the `--code` argument that can point to
-a Python file:
+a Python file. For more details on training with custom code, see the
+[training documentation](/usage/training#custom-code).
 
 ```bash
 $ python -m spacy train ./train.spacy ./dev.spacy ./config.cfg --code ./code.py
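
The `configure_custom_sent_spans` function named in the second hunk's context is a registered factory: the config references it by name, spaCy calls it at resolution time, and it returns a closure that the `Transformer` component uses to pick spans from each `Doc`. As a framework-free sketch of that factory-returns-a-closure pattern (the function body, the `max_length` parameter, and the plain-list "docs" are illustrative assumptions, not taken from the patch):

```python
def configure_custom_sent_spans(max_length: int):
    """Factory: the config supplies max_length; the returned closure
    is what the pipeline component actually calls on batches of docs."""

    def get_custom_sent_spans(docs):
        # For each doc, emit windows of at most max_length tokens.
        spans = []
        for doc in docs:
            doc_spans = []
            for start in range(0, len(doc), max_length):
                doc_spans.append(doc[start:start + max_length])
            spans.append(doc_spans)
        return spans

    return get_custom_sent_spans

# Resolving the config calls the factory once; training then reuses
# the closure on every batch.
get_spans = configure_custom_sent_spans(max_length=3)
print(get_spans([["This", "is", "a", "short", "doc", "."]]))
# → [[['This', 'is', 'a'], ['short', 'doc', '.']]]
```

Passing the file containing such a function via `--code` (as in the `spacy train` invocation in the hunk) is what makes the name resolvable from the config.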