mirror of https://github.com/explosion/spaCy.git
Commit 3fd3531e12
* Use TransformerModel.v2 in the quickstart
* Update docs for the new transformer architectures
* Bump spacy-transformers to 1.1.0
* Add the new arguments for spacy-transformers.TransformerModel.v3
* Mention that mixed-precision support is experimental
* Describe the delta between transformers.Tok2VecTransformer versions
* Add dot
* Add dot, again
* Update some more TransformerModel references from v2 to v3
* Add mixed-precision options to the training quickstart; disable mixed-precision training/prediction by default
* Update setup.cfg
* Apply suggestions from code review
* Update website/docs/usage/embeddings-transformers.md

Co-authored-by: Adriane Boyd <adrianeboyd@gmail.com>
Co-authored-by: Daniël de Kok <me@danieldk.eu>
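As context for the TransformerModel.v3 and mixed-precision items above, here is a sketch of what the transformer component could look like in a training config.cfg. The argument names follow the spacy-transformers 1.1.0 documentation; the model name and the window/stride values are placeholder choices, not part of this commit.

```ini
[components.transformer]
factory = "transformer"

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "roberta-base"
tokenizer_config = {"use_fast": true}
transformer_config = {}
# New in v3: experimental mixed-precision support, disabled by default
mixed_precision = false
grad_scaler_config = {}

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
window = 128
stride = 96
```

Per the commit message, mixed-precision training/prediction stays off unless `mixed_precision` is explicitly set to `true`.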
| Name |
| --- |
| 101 |
| _benchmarks-models.md |
| embeddings-transformers.md |
| facts-figures.md |
| index.md |
| layers-architectures.md |
| linguistic-features.md |
| models.md |
| processing-pipelines.md |
| projects.md |
| rule-based-matching.md |
| saving-loading.md |
| spacy-101.md |
| training.md |
| v2-1.md |
| v2-2.md |
| v2-3.md |
| v2.md |
| v3-1.md |
| v3.md |
| visualizers.md |