tokenizer doc fix

svlandeg 2019-07-15 11:19:34 +02:00
parent c0e29f7029
commit 6026958957
1 changed file with 1 addition and 1 deletion


@@ -348,7 +348,7 @@ cdef class Tokenizer:
     """Add a special-case tokenization rule.
     string (unicode): The string to specially tokenize.
-    token_attrs (iterable): A sequence of dicts, where each dict describes
+    substrings (iterable): A sequence of dicts, where each dict describes
        a token and its attributes. The `ORTH` fields of the attributes
        must exactly match the string when they are concatenated.
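
For context, the renamed parameter belongs to spaCy's `Tokenizer.add_special_case`. A minimal usage sketch, assuming spaCy's public API; the "gimme" string and its particular split are illustrative, not part of this commit:

import spacy
from spacy.attrs import ORTH

nlp = spacy.blank("en")
# Each dict in `substrings` describes one output token; the ORTH values
# must concatenate back to the original string ("gim" + "me" == "gimme").
nlp.tokenizer.add_special_case("gimme", [{ORTH: "gim"}, {ORTH: "me"}])
print([t.text for t in nlp("gimme that")])  # ['gim', 'me', 'that']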