---
title: Model Architectures
teaser: Pre-defined model architectures included with the core library
source: spacy/ml/models
menu:
- ['Tok2Vec', 'tok2vec']
- ['Transformers', 'transformers']
- ['Parser & NER', 'parser']
- ['Tagging', 'tagger']
- ['Text Classification', 'textcat']
- ['Entity Linking', 'entitylinker']
---

A model architecture is a function that wires up a
[Thinc `Model`](https://thinc.ai/docs/api-model) instance, which you can then
use in a pipeline component or as a layer of a larger network. All architectures
on this page are registered in the [`registry`](/api/top-level#registry), so you
can refer to them by name in your training config, and you can plug in
[custom models](/usage/training#custom-models) of your own in the same way.
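
As a quick sketch of how this fits together (the name `my_org.CustomTok2Vec.v1`
and its `width` argument are made up for illustration, and the delegated call
assumes the parameter table of [HashEmbedCNN](#HashEmbedCNN) below), a custom
architecture is just a registered function that returns a Thinc `Model`:

```python
from typing import List

import spacy
from spacy.tokens import Doc
from thinc.api import Model
from thinc.types import Floats2d


@spacy.registry.architectures("my_org.CustomTok2Vec.v1")
def build_custom_tok2vec(width: int) -> Model[List[Doc], List[Floats2d]]:
    # For illustration, delegate to a built-in architecture documented below.
    # A real custom architecture would wire up its own Thinc layers here.
    build = spacy.registry.architectures.get("spacy.HashEmbedCNN.v1")
    return build(
        width=width,
        depth=4,
        embed_size=2000,
        window_size=1,
        maxout_pieces=3,
        subword_features=True,
        dropout=0.0,
        pretrained_vectors=False,
    )
```

In the config, a `[model]` block with `@architectures = "my_org.CustomTok2Vec.v1"`
and `width = 96` would then resolve to a call of this function.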
## Tok2Vec architectures {#tok2vec source="spacy/ml/models/tok2vec.py"}

### spacy.HashEmbedCNN.v1 {#HashEmbedCNN}

Build spaCy's "standard" tok2vec layer, which uses hash embedding with subword
features and a CNN with layer-normalized maxout.

> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy.HashEmbedCNN.v1"
> # illustrative values, see the parameter table below
> width = 96
> depth = 4
> embed_size = 2000
> window_size = 1
> maxout_pieces = 3
> subword_features = true
> dropout = 0.0
> pretrained_vectors = false
> ```

| Name                 | Type  | Description |
| -------------------- | ----- | ----------- |
| `width`              | int   | The width of the token vectors produced, and of the intermediate CNN layers. |
| `depth`              | int   | The number of convolutional layers. |
| `embed_size`         | int   | The number of rows in the hash embedding tables. |
| `window_size`        | int   | The number of tokens on either side to concatenate in each convolutional layer. |
| `maxout_pieces`      | int   | The number of pieces used in the maxout non-linearity. |
| `subword_features`   | bool  | Whether to also embed subword features (prefix, suffix and word shape) in addition to the word itself. |
| `dropout`            | float | The dropout rate applied to the embedding layer. |
| `pretrained_vectors` | bool  | Whether to also use static pretrained vectors as features, if available. |
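
As a rough illustration of the `embed_size` setting (conceptual only, not the
actual implementation, which hashes several lexical attributes into multiple
tables): a hash embedding table does not keep one row per vocabulary entry, but
maps any string into a fixed number of rows, so the table size is independent of
the vocabulary size.

```python
import numpy

embed_size, width = 2000, 96
table = numpy.zeros((embed_size, width), dtype="float32")  # learned rows

# Any string can be embedded by hashing it into the table, collisions and all:
row = hash("avocado") % embed_size
vector = table[row]
assert vector.shape == (width,)
```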

### spacy.HashCharEmbedCNN.v1 {#HashCharEmbedCNN}

### spacy.HashCharEmbedBiLSTM.v1 {#HashCharEmbedBiLSTM}

## Transformer architectures {#transformers source="github.com/explosion/spacy-transformers/blob/master/spacy_transformers/architectures.py"}

The following architectures are provided by the package
[`spacy-transformers`](https://github.com/explosion/spacy-transformers). See the
[usage documentation](/usage/transformers) for how to integrate the
architectures into your training config.
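
For example, assuming `spacy-transformers` is installed so that its `transformer`
component factory is available, the component can also be added in Python with a
config that references the architecture (a sketch, mirroring the example config
in the next section; unspecified settings fall back to the factory defaults):

```python
import spacy

nlp = spacy.blank("en")
# The nested dict mirrors the [components.transformer.model] block of a
# training config.
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "@architectures": "spacy-transformers.TransformerModel.v1",
            "name": "roberta-base",
            "tokenizer_config": {"use_fast": True},
        }
    },
)
```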

### spacy-transformers.TransformerModel.v1 {#TransformerModel}

Load and wrap a transformer model from the
[HuggingFace `transformers`](https://huggingface.co/transformers) library, so it
can be used as the model of the [`Transformer`](/api/transformer) pipeline
component. The `get_spans` callback determines which (possibly overlapping)
spans of each `Doc` the transformer processes, so long documents can be handled
within the model's sequence length limit.
> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy-transformers.TransformerModel.v1"
> name = "roberta-base"
> tokenizer_config = {"use_fast": true}
>
> [model.get_spans]
> @span_getters = "strided_spans.v1"
> window = 128
> stride = 96
> ```

| Name               | Type             | Description |
| ------------------ | ---------------- | ----------- |
| `name`             | str              | Any model name that can be loaded by [`transformers.AutoModel`](https://huggingface.co/transformers/model_doc/auto.html#transformers.AutoModel). |
| `get_spans`        | `Callable`       | Function that takes a batch of [`Doc`](/api/doc) objects and returns lists of [`Span`](/api/span) objects for the transformer to process. [See here](/api/transformer#span_getters) for built-in options and examples. |
| `tokenizer_config` | `Dict[str, Any]` | Tokenizer settings passed to [`transformers.AutoTokenizer`](https://huggingface.co/transformers/model_doc/auto.html#transformers.AutoTokenizer). |
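
To make the `window` and `stride` settings in the example concrete, here is a
conceptual sketch of what a strided span getter does (not the actual
`strided_spans.v1` implementation): each `Doc` is cut into overlapping windows
of tokens, so that context around the window boundaries is not lost.

```python
from typing import List

from spacy.tokens import Doc, Span


def strided_spans(docs: List[Doc], window: int = 128, stride: int = 96) -> List[List[Span]]:
    # Produce spans of `window` tokens, starting every `stride` tokens, so
    # consecutive spans overlap by `window - stride` tokens.
    result = []
    for doc in docs:
        spans = [doc[start : start + window] for start in range(0, max(len(doc), 1), stride)]
        result.append(spans)
    return result
```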

### spacy-transformers.Tok2VecListener.v1 {#Tok2VecListener}

Create a listener layer that connects to a [`Transformer`](/api/transformer)
component earlier in the pipeline and reuses its predictions, pooling the
transformer's wordpiece vectors into one vector per spaCy token.
> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy-transformers.Tok2VecListener.v1"
> grad_factor = 1.0
>
> [model.pooling]
> @layers = "reduce_mean.v1"
> ```

| Name          | Type                      | Description |
| ------------- | ------------------------- | ---------------------------------------------------------------------------------------------- |
| `grad_factor` | float | Factor for weighting the gradient if multiple components listen to the same transformer model. |
| `pooling` | `Model[Ragged, Floats2d]` | Pooling layer to determine how the vector for each spaCy token will be computed. |
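
As a rough illustration of the pooling step (conceptual only, using plain numpy
rather than the `Ragged` arrays the layer actually operates on): with
`reduce_mean.v1`, the wordpiece vectors aligned to a spaCy token are averaged
into a single vector for that token.

```python
import numpy

# Suppose one spaCy token was split into three wordpieces, each with a
# 768-dimensional transformer vector:
wordpiece_vectors = numpy.zeros((3, 768), dtype="float32")

# Mean pooling produces one 768-dimensional vector for the spaCy token:
token_vector = wordpiece_vectors.mean(axis=0)
assert token_vector.shape == (768,)
```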

## Parser & NER architectures {#parser source="spacy/ml/models/parser.py"}

### spacy.TransitionBasedParser.v1 {#TransitionBasedParser}
> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy.TransitionBasedParser.v1"
> nr_feature_tokens = 6
> hidden_width = 64
> maxout_pieces = 2
>
> [model.tok2vec]
> # ...
> ```

| Name                | Type                                       | Description |
| ------------------- | ------------------------------------------ | ----------- |
| `tok2vec`           | [`Model`](https://thinc.ai/docs/api-model) | Subnetwork to map tokens into vector representations. |
| `nr_feature_tokens` | int                                        | The number of tokens whose vectors are used as features to predict the next transition. |
| `hidden_width`      | int                                        | The width of the hidden layer. |
| `maxout_pieces`     | int                                        | The number of maxout pieces used in the hidden layer. |
| `use_upper`         | bool                                       | Whether to use an additional hidden layer between the state representation and the output scores. |
| `nO`                | int                                        | The number of output classes, i.e. transition actions. Usually inferred from the data. |

## Tagging architectures {#tagger source="spacy/ml/models/tagger.py"}

### spacy.Tagger.v1 {#Tagger}

Build a tagger model, using a provided token-to-vector component. The tagger
model adds a linear layer with softmax activation to predict scores given the
token vectors.
> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy.Tagger.v1"
> nO = null
>
> [model.tok2vec]
> # ...
> ```

| Name      | Type                                       | Description |
| --------- | ------------------------------------------ | ----------- |
| `tok2vec` | [`Model`](https://thinc.ai/docs/api-model) | Subnetwork to map tokens into vector representations. |
| `nO`      | int                                        | The number of tags to output. Usually inferred from the data. |
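
Conceptually (plain numpy rather than the actual Thinc implementation), the
output layer turns each token's vector into a probability distribution over the
tag set:

```python
import numpy

n_tokens, width, n_tags = 4, 96, 50
token_vectors = numpy.zeros((n_tokens, width), dtype="float32")  # from tok2vec
W = numpy.zeros((width, n_tags), dtype="float32")  # learned weights

scores = token_vectors @ W
# Softmax over the tag axis gives one probability distribution per token.
probs = numpy.exp(scores) / numpy.exp(scores).sum(axis=1, keepdims=True)
assert probs.shape == (n_tokens, n_tags)
```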

## Text classification architectures {#textcat source="spacy/ml/models/textcat.py"}

### spacy.TextCatEnsemble.v1 {#TextCatEnsemble}

### spacy.TextCatBOW.v1 {#TextCatBOW}

### spacy.TextCatCNN.v1 {#TextCatCNN}

### spacy.TextCatLowData.v1 {#TextCatLowData}

## Entity linking architectures {#entitylinker source="spacy/ml/models/entity_linker.py"}

### spacy.EntityLinker.v1 {#EntityLinker}

Build an entity linking model, consisting of a token-to-vector layer and an
output layer that produces a context vector for each mention. The context vector
is compared against the candidate entity vectors in the knowledge base to decide
which entity, if any, the mention refers to.
> #### Example Config
>
> ```ini
> [model]
> @architectures = "spacy.EntityLinker.v1"
> nO = null
>
> [model.tok2vec]
> # ...
> ```

| Name      | Type                                       | Description |
| --------- | ------------------------------------------ | ----------- |
| `tok2vec` | [`Model`](https://thinc.ai/docs/api-model) | Subnetwork to map tokens into vector representations. |
| `nO`      | int                                        | Output dimension of the context vectors. Has to match the entity vector length in the knowledge base, and is usually inferred during initialization. |
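
As a rough illustration of how the output is used (conceptual only; the actual
`entity_linker` component also weighs in candidate prior probabilities): the
model's context vector is compared against each candidate's entity vector from
the knowledge base, for example by cosine similarity.

```python
import numpy

context_vector = numpy.ones((64,), dtype="float32")       # model output for one mention
candidate_vectors = numpy.ones((3, 64), dtype="float32")  # entity vectors from the KB

# Cosine similarity between the context vector and each candidate:
sims = candidate_vectors @ context_vector / (
    numpy.linalg.norm(candidate_vectors, axis=1) * numpy.linalg.norm(context_vector)
)
best_candidate = int(sims.argmax())
```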