Add = to tokenizer prefixes

Ines Montani 2017-02-02 16:21:11 +01:00
parent ff04748eb6
commit 1219a5f513
1 changed file with 1 addition and 1 deletion


@@ -72,7 +72,7 @@ HYPHENS = _HYPHENS.strip().replace(' ', '|')
 # Prefixes
 TOKENIZER_PREFIXES = (
-    ['§', '%', r'\+'] +
+    ['§', '%', '=', r'\+'] +
     LIST_PUNCT +
     LIST_ELLIPSES +
     LIST_QUOTES +
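
For illustration only (this is a sketch, not spaCy's actual tokenizer code): prefix entries such as these are typically escaped and joined into a single regular expression that matches punctuation at the start of a token, so adding '=' allows a string like "=5" to be split into "=" and "5". The helper name below is hypothetical.

import re

def compile_prefix_regex(prefixes):
    # Join the prefix patterns into one alternation anchored at the start
    # of the string (the entries here are already regex-safe).
    return re.compile("^(?:" + "|".join(prefixes) + ")")

OLD_PREFIXES = ['§', '%', r'\+']
NEW_PREFIXES = ['§', '%', '=', r'\+']

old_re = compile_prefix_regex(OLD_PREFIXES)
new_re = compile_prefix_regex(NEW_PREFIXES)

token = "=5"
print(bool(old_re.search(token)))  # False: '=' was not treated as a prefix before
print(bool(new_re.search(token)))  # True: '=' can now be split off the front of the token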