mirror of https://github.com/explosion/spaCy.git
Commit eae35e9b27:

- add files to specify rules for German tokenization
- change generate_specials.py to generate from an external file (abbrev.de.tab)
- copy gazetteer.json from lang_data/en/
- init_model.py: change doc freq threshold to 0
- add train_german_tagger.py (expects CoNLL-09-formatted input)
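The commit message does not show the layout of abbrev.de.tab or the exact schema of specials.json, so the following is only a minimal sketch of what a generation step like generate_specials.py's might look like. The tab-separated column layout (abbreviation, optional expansion) and the "F"/"L" token keys are assumptions made for illustration, not taken from the repository files.

```python
# Hypothetical sketch of generating tokenizer special cases from an external
# abbreviation list. Assumes abbrev.de.tab has one abbreviation per line with
# an optional tab-separated expansion; the output schema is also an assumption.
import json


def load_abbreviations(path="abbrev.de.tab"):
    """Read abbreviations; treat an optional second column as the expansion."""
    abbrevs = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            parts = line.split("\t")
            abbrev = parts[0]
            expansion = parts[1] if len(parts) > 1 else abbrev
            abbrevs[abbrev] = expansion
    return abbrevs


def build_specials(abbrevs):
    """Map each abbreviation to a single-token exception so the tokenizer
    keeps forms like 'z.B.' together instead of splitting on the periods."""
    return {form: [{"F": form, "L": expansion}]
            for form, expansion in abbrevs.items()}


if __name__ == "__main__":
    specials = build_specials(load_abbreviations())
    with open("specials.json", "w", encoding="utf-8") as f:
        json.dump(specials, f, ensure_ascii=False, indent=2, sort_keys=True)
```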
Directory contents:

- abbrev.de.tab
- gazetteer.json
- generate_specials.py
- infix.txt
- lemma_rules.json
- morphs.json
- prefix.txt
- sample.txt
- specials.json
- suffix.txt
- tag_map.json