mirror of https://github.com/explosion/spaCy.git
Commit 112c5787eb: More robust Hungarian tokenizer.
Files in this directory:

- __init__.py
- lemmatizer.py
- punctuation.py
- stop_words.py
- tokenizer_exceptions.py
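
These modules supply the Hungarian language data (punctuation rules, tokenizer exceptions, stop words, lemmatizer hooks) that spaCy assembles into a tokenizer. As a minimal sketch of how that data is consumed, assuming a spaCy release where Hungarian is available through `spacy.blank("hu")` (the exact import path has changed between versions):

```python
# Sketch: tokenize Hungarian text using the language data in this directory.
# spacy.blank("hu") builds a pipeline-free Language object whose tokenizer is
# configured from punctuation.py and tokenizer_exceptions.py, and whose stop
# word set comes from stop_words.py.
import spacy

nlp = spacy.blank("hu")

doc = nlp("Dr. Kovács 10.30-kor érkezik a 3. emeletre.")

# Inspect the token boundaries produced by the Hungarian rules.
print([token.text for token in doc])

# Stop words defined in stop_words.py are exposed via token.is_stop.
print([(token.text, token.is_stop) for token in doc])
```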