Update docs [ci skip]

Ines Montani 2020-10-15 08:58:30 +02:00
parent bc85b12e6d
commit abeafcbc08
2 changed files with 33 additions and 8 deletions


@@ -24,8 +24,7 @@ import { Help } from 'components/typography'; import Link from 'components/link'
| Named Entity Recognition System | OntoNotes | CoNLL '03 |
| ------------------------------------------------------------------------------ | --------: | --------: |
| spaCy RoBERTa (2020) | 89.7 | 91.6 |
-| spaCy CNN (2020) | 84.5 | |
-| spaCy CNN (2017) | | |
+| spaCy CNN (2020) | 84.5 | 87.4 |
| [Stanza](https://stanfordnlp.github.io/stanza/) (StanfordNLP)<sup>1</sup> | 88.8 | 92.1 |
| <Link to="https://github.com/flairNLP/flair" hideIcon>Flair</Link><sup>2</sup> | 89.7 | 93.1 |
| BERT Base<sup>3</sup> | - | 92.4 |
@@ -36,9 +35,11 @@ import { Help } from 'components/typography'; import Link from 'components/link'
[OntoNotes 5.0](https://catalog.ldc.upenn.edu/LDC2013T19) and
[CoNLL-2003](https://www.aclweb.org/anthology/W03-0419.pdf) corpora. See
[NLP-progress](http://nlpprogress.com/english/named_entity_recognition.html) for
-more results. **1. ** [Qi et al. (2020)](https://arxiv.org/pdf/2003.07082.pdf).
-**2. ** [Akbik et al. (2018)](https://www.aclweb.org/anthology/C18-1139/). **3.
-** [Devlin et al. (2018)](https://arxiv.org/abs/1810.04805).
+more results. Project template:
+[`benchmarks/ner_conll03`](%%GITHUB_PROJECTS/benchmarks/ner_conll03). **1. **
+[Qi et al. (2020)](https://arxiv.org/pdf/2003.07082.pdf). **2. **
+[Akbik et al. (2018)](https://www.aclweb.org/anthology/C18-1139/). **3. **
+[Devlin et al. (2018)](https://arxiv.org/abs/1810.04805).
</figcaption>
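The NER scores in the table above are entity-level F-measures on held-out test sets. As a reminder of what that metric means, here is a minimal sketch of span-level precision/recall/F1 over `(start, end, label)` entity spans — the spans below are made up for illustration, and this is not spaCy's actual `Scorer` implementation:

```python
# Entity-level precision/recall/F1: a predicted entity counts as correct
# only if both its span boundaries and its label exactly match a gold entity.

def f1_score(gold, pred):
    """gold, pred: iterables of (start, end, label) spans for one document."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)                      # exact span + label matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return precision, recall, 0.0
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

gold = [(0, 2, "PERSON"), (5, 7, "ORG"), (9, 10, "GPE")]
pred = [(0, 2, "PERSON"), (5, 7, "LOC")]       # one hit, one label error, one miss
p, r, f = f1_score(gold, pred)
print(round(p, 3), round(r, 3), round(f, 3))   # → 0.5 0.333 0.4
```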


@@ -65,8 +65,8 @@ import Benchmarks from 'usage/\_benchmarks-models.md'
| Dependency Parsing System | UAS | LAS |
| ------------------------------------------------------------------------------ | ---: | ---: |
-| spaCy RoBERTa (2020)<sup>1</sup> | 95.5 | 94.3 |
-| spaCy CNN (2020)<sup>1</sup> | | |
+| spaCy RoBERTa (2020) | 95.5 | 94.3 |
+| spaCy CNN (2020) | | |
| [Mrini et al.](https://khalilmrini.github.io/Label_Attention_Layer.pdf) (2019) | 97.4 | 96.3 |
| [Zhou and Zhao](https://www.aclweb.org/anthology/P19-1230/) (2019) | 97.2 | 95.7 |
@@ -74,13 +74,37 @@ import Benchmarks from 'usage/\_benchmarks-models.md'
**Dependency parsing accuracy** on the Penn Treebank. See
[NLP-progress](http://nlpprogress.com/english/dependency_parsing.html) for more
-results. **1. ** Project template:
+results. Project template:
[`benchmarks/parsing_penn_treebank`](%%GITHUB_PROJECTS/benchmarks/parsing_penn_treebank).
</figcaption>
</figure>
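The UAS/LAS columns in the parsing table are per-token attachment scores: UAS counts tokens whose predicted head is correct, and LAS additionally requires the dependency label to match. A minimal sketch, using toy `(head_index, dep_label)` trees invented for illustration:

```python
# Unlabeled (UAS) and labeled (LAS) attachment scores over one sentence.
# Each token is represented as (head_index, dep_label); index 0 is the root.

def uas_las(gold, pred):
    """gold, pred: equal-length lists of (head_index, dep_label) per token."""
    assert len(gold) == len(pred)
    uas_hits = sum(g[0] == p[0] for g, p in zip(gold, pred))  # head correct
    las_hits = sum(g == p for g, p in zip(gold, pred))        # head + label
    n = len(gold)
    return uas_hits / n, las_hits / n

gold = [(2, "nsubj"), (0, "root"), (2, "dobj"), (3, "amod")]
pred = [(2, "nsubj"), (0, "root"), (2, "iobj"), (2, "amod")]
uas, las = uas_las(gold, pred)   # → (0.75, 0.5)
```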
+### Speed comparison {#benchmarks-speed}
+<!-- TODO: intro -->
+<figure>
+| Library | Pipeline | WPS CPU <Help>words per second on CPU, higher is better</Help> | WPS GPU <Help>words per second on GPU, higher is better</Help> |
+| ------- | ----------------------------------------------- | -------------------------------------------------------------: | -------------------------------------------------------------: |
+| spaCy | [`en_core_web_md`](/models/en#en_core_web_md) | | |
+| spaCy | [`en_core_web_trf`](/models/en#en_core_web_trf) | | |
+| Stanza | `en_ewt` | | |
+| Flair | `pos-fast_ner-fast` | | |
+| Flair | `pos_ner` | | |
+| UDPipe | `english-ewt-ud-2.5` | | |
+<figcaption class="caption">
+**End-to-end processing speed** on raw unannotated text. Project template:
+[`benchmarks/speed`](%%GITHUB_PROJECTS/benchmarks/speed).
+</figcaption>
+</figure>
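Words per second over raw text is the metric the speed table reports. A minimal timing harness in that spirit — the whitespace tokenizer here is a stand-in I'm using so the sketch is self-contained, not a real pipeline; in practice you would pass a loaded model's processing call (e.g. iterating `nlp.pipe(texts)`) as `process`:

```python
# Words-per-second measurement: total whitespace-delimited words in the
# corpus divided by wall-clock time spent running the pipeline over it.
import time

def words_per_second(process, texts):
    """Time `process` over all texts and return throughput in words/second."""
    n_words = sum(len(t.split()) for t in texts)
    start = time.perf_counter()
    for text in texts:
        process(text)                  # stand-in for the real pipeline call
    elapsed = time.perf_counter() - start
    return n_words / elapsed

texts = ["Apple is looking at buying a U.K. startup ."] * 1000
wps = words_per_second(lambda t: t.split(), texts)
```

Real benchmarks should also warm up the pipeline first and batch texts rather than processing one at a time, which is what the `benchmarks/speed` project template is for.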
<!-- TODO: ## Citing spaCy {#citation}
-->