Port over changes and add note on compat (see #1445)

This commit is contained in:
ines 2017-11-06 13:58:34 +01:00
parent 2d85ee6b5d
commit c646365e2f
1 changed file with 19 additions and 14 deletions

@@ -3,16 +3,21 @@
# A decomposable attention model for Natural Language Inference
**by Matthew Honnibal, [@honnibal](https://github.com/honnibal)**
> ⚠️ **IMPORTANT NOTE:** This example is currently only compatible with spaCy
> v1.x. We're working on porting the example over to Keras v2.x and spaCy v2.x.
> See [#1445](https://github.com/explosion/spaCy/issues/1445) for details.
> Contributions welcome!
This directory contains an implementation of the entailment prediction model described
by [Parikh et al. (2016)](https://arxiv.org/pdf/1606.01933.pdf). The model is notable
for its competitive performance with very few parameters.
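At a high level, the model decomposes inference into three steps (attend, compare, aggregate), each applied to individual token pairs or tokens, which is what keeps the parameter count so small. As a rough orientation only (this is not the code in this repo), here is a NumPy sketch of that data flow; the learned feed-forward networks F, G and H from the paper are omitted, with plain dot products and concatenation standing in for them:

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy inputs: two sentences as (n_tokens, vector_width) matrices of word vectors.
rng = np.random.RandomState(0)
premise = rng.randn(5, 300)     # e.g. 5 tokens of 300-dimensional GloVe vectors
hypothesis = rng.randn(7, 300)  # e.g. 7 tokens

# Attend: score every token pair, then softly align each sentence to the other.
scores = premise.dot(hypothesis.T)                              # (5, 7)
aligned_to_premise = softmax(scores, axis=1).dot(hypothesis)    # (5, 300)
aligned_to_hypothesis = softmax(scores, axis=0).T.dot(premise)  # (7, 300)

# Compare: pair each token with its soft-aligned counterpart. In the paper this
# pair is fed through a small feed-forward network G; concatenation stands in here.
v1 = np.hstack([premise, aligned_to_premise])        # (5, 600)
v2 = np.hstack([hypothesis, aligned_to_hypothesis])  # (7, 600)

# Aggregate: sum over tokens to get fixed-size vectors, then classify from them
# (the classifier H, another small feed-forward network, is omitted here).
features = np.concatenate([v1.sum(axis=0), v2.sum(axis=0)])     # (1200,)
```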
The model is implemented using [Keras](https://keras.io/) and [spaCy](https://spacy.io).
Keras is used to build and train the network. spaCy is used to load
the [GloVe](http://nlp.stanford.edu/projects/glove/) vectors, perform the
feature extraction, and help you apply the model at run-time. The following
demo code shows how the entailment model can be used at runtime, once the
hook is installed to customise the `.similarity()` method of spaCy's `Doc`
and `Span` objects:
```python
@@ -37,27 +42,27 @@ lots of ways to extend the model.
| File | Description |
| --- | --- |
| `__main__.py` | The script that will be executed. Defines the CLI, the data reading, etc — all the boring stuff. |
| `spacy_hook.py` | Provides a class `SimilarityShim` that lets you use an arbitrary function to customize spaCy's `doc.similarity()` method. Instead of the default average-of-vectors algorithm, when you call `doc1.similarity(doc2)`, you'll get the result of `your_model(doc1, doc2)` (see the sketch below this table). |
| `keras_decomposable_attention.py` | Defines the neural network model. |
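To give a feel for the hook mechanism before you read `spacy_hook.py`: the sketch below uses spaCy's `user_hooks` / `user_span_hooks` dictionaries directly, and `fake_entailment_model` is only a placeholder for the trained Keras model, not part of this example:

```python
import spacy

nlp = spacy.load('en')

def fake_entailment_model(doc1, doc2):
    # Placeholder for the trained Keras model: any callable that takes two
    # Doc/Span objects and returns a float can be plugged in here.
    return 0.5

doc1 = nlp(u'The quick brown fox jumps over the lazy dog.')
doc2 = nlp(u'A fox is jumping over a dog.')

# Install the hook on doc1: its .similarity() calls now delegate to our function,
# and spans of doc1 pick up the same behaviour via user_span_hooks.
doc1.user_hooks['similarity'] = fake_entailment_model
doc1.user_span_hooks['similarity'] = fake_entailment_model

print(doc1.similarity(doc2))  # 0.5, straight from fake_entailment_model
```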
## Setting up
First, install [Keras](https://keras.io/), [spaCy](https://spacy.io) and the spaCy
English models (about 1GB of data):
```bash
pip install https://github.com/fchollet/keras/archive/1.2.2.zip
pip install spacy
python -m spacy.en.download
```
⚠️ **Important:** In order for the example to run, you'll need to install Keras from
the 1.2.2 release (and not via `pip install keras`). For more info on this, see
[#727](https://github.com/explosion/spaCy/issues/727).
You'll also want to get Keras working on your GPU. This will depend on your
setup, so you're mostly on your own for this step. If you're using AWS, try the
[NVidia AMI](https://aws.amazon.com/marketplace/pp/B00FYCDDTE). It made things pretty easy.
Once you've installed the dependencies, you can run a small preliminary test of
@@ -94,5 +99,5 @@ you how run-time usage will eventually look.
## Getting updates
We should have the blog post explaining the model ready before the end of the week. To get
notified when it's published, you can either follow me on [Twitter](https://twitter.com/honnibal),
or subscribe to our [mailing list](http://eepurl.com/ckUpQ5).