Contents of the `spaCy/examples` directory:

- inventory_count
- keras_parikh_entailment
- paddle/sentiment_bilstm
- sentiment
- training
- README.md
- _handler.py
- chainer_sentiment.py
- deep_learning_keras.py
- get_parse_subregions.py
- information_extraction.py
- matcher_example.py
- multi_word_matches.py
- nn_text_class.py
- parallel_parse.py
- pos_tag.py
- spacy_dynet_lstm.py
- twitter_filter.py


# spaCy examples

The examples are Python scripts with well-behaved command line interfaces. For a full list of spaCy tutorials and code snippets, see the documentation.
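
A command line interface of this kind is often declared with plac, which builds an argparse parser from a decorated function signature and produces usage messages in the style shown below. The following is a minimal, hypothetical sketch of that pattern, not code taken from any of the examples; the parameter names, help strings, and defaults are invented for illustration.

```python
# Hypothetical sketch of a plac-style CLI declaration; names are illustrative only.
import plac


@plac.annotations(
    # Each tuple is (help, kind, abbreviation, type).
    data_dir=("Directory containing the training data", "positional", None, str),
    nr_iter=("Number of training iterations", "option", "i", int),
    batch_size=("Minibatch size", "option", "b", int),
)
def main(data_dir, nr_iter=5, batch_size=24):
    # A real example would do its work here; this stub just echoes the settings.
    print("data_dir=%s nr_iter=%d batch_size=%d" % (data_dir, nr_iter, batch_size))


if __name__ == "__main__":
    # plac.call() parses sys.argv against the annotations above and prints a
    # usage message when required arguments are missing.
    plac.call(main)
```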

## How to run an example

For example, to run the `nn_text_class.py` script, do:

```
$ python examples/nn_text_class.py
usage: nn_text_class.py [-h] [-d 3] [-H 300] [-i 5] [-w 40000] [-b 24]
                        [-r 0.3] [-p 1e-05] [-e 0.005]
                        data_dir
nn_text_class.py: error: too few arguments
```

You can print detailed help with the `-h` argument.
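
A complete invocation supplies the one required positional argument, `data_dir`, plus any options you want to override. As a hedged illustration, the path below is only a placeholder (check the script's docstring for the expected data layout) and the option values simply repeat the defaults from the usage message above:

```
$ python examples/nn_text_class.py -i 5 -b 24 /path/to/data_dir
```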

While we try to keep the examples up to date, they are not currently exercised by the test suite, as some of them require significant data downloads or take time to train. If you find that an example is no longer running, please tell us! We know there's nothing worse than trying to figure out what you're doing wrong, only to find that your code was never the problem.