fix few typos

svlandeg 2020-10-14 15:01:19 +02:00
parent 0aa8851878
commit 478a14a619
1 changed file with 5 additions and 5 deletions


@@ -503,7 +503,7 @@ overview of the `TrainablePipe` methods used by
 </Infobox>
-### Example: Entity elation extraction component {#component-rel}
+### Example: Entity relation extraction component {#component-rel}
 This section outlines an example use-case of implementing a **novel relation
 extraction component** from scratch. We'll implement a binary relation
@@ -618,7 +618,7 @@ we can define our relation model in a config file as such:
 # ...
 [model.get_candidates]
-@misc = "rel_cand_generator.v2"
+@misc = "rel_cand_generator.v1"
 max_length = 20
 [model.create_candidate_tensor]
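
The `@misc` entry above points at a function registered in spaCy's `misc` registry under the name `rel_cand_generator.v1`, which receives the `max_length` setting from the config and returns the candidate generator. Below is a minimal sketch of what such a registered factory could look like; the pairing logic and the exact return type are illustrative assumptions, not the documented implementation.

```python
from typing import Callable, List, Tuple

import spacy
from spacy.tokens import Doc, Span


@spacy.registry.misc("rel_cand_generator.v1")
def create_candidate_generator(max_length: int) -> Callable[[Doc], List[Tuple[Span, Span]]]:
    """Build a function that proposes candidate entity pairs for a Doc."""

    def get_candidates(doc: Doc) -> List[Tuple[Span, Span]]:
        candidates = []
        for ent1 in doc.ents:
            for ent2 in doc.ents:
                # Illustrative filter: skip identical entities and pairs whose
                # entities start more than `max_length` tokens apart.
                if ent1 != ent2 and abs(ent2.start - ent1.start) <= max_length:
                    candidates.append((ent1, ent2))
        return candidates

    return get_candidates
```
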
@@ -687,8 +687,8 @@ Before the model can be used, it needs to be
 [initialized](/usage/training#initialization). This function receives a callback
 to access the full **training data set**, or a representative sample. This data
 set can be used to deduce all **relevant labels**. Alternatively, a list of
-labels can be provided to `initialize`, or you can call the
-`RelationExtractoradd_label` directly. The number of labels defines the output
+labels can be provided to `initialize`, or you can call
+`RelationExtractor.add_label` directly. The number of labels defines the output
 dimensionality of the network, and will be used to do
 [shape inference](https://thinc.ai/docs/usage-models#validation) throughout the
 layers of the neural network. This is triggered by calling
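
To make this initialization flow concrete, here is a rough sketch of what the `initialize` method of a custom component like the `RelationExtractor` above might look like, written against spaCy v3's `TrainablePipe` API. The `._.rel` extension used to deduce labels and the `_examples_to_truth` helper are hypothetical; the call to `Model.initialize` with a data sample is what triggers Thinc's shape inference.

```python
from itertools import islice
from typing import Callable, Iterable, List, Optional

from spacy.language import Language
from spacy.pipeline import TrainablePipe
from spacy.training import Example


class RelationExtractor(TrainablePipe):
    # Only initialize is sketched here; __init__, add_label, predict, etc.
    # are omitted.

    def initialize(
        self,
        get_examples: Callable[[], Iterable[Example]],
        *,
        nlp: Optional[Language] = None,
        labels: Optional[List[str]] = None,
    ) -> None:
        if labels is not None:
            # An explicit list of labels was provided to initialize.
            for label in labels:
                self.add_label(label)
        else:
            # Deduce the relevant labels from the training data callback.
            for example in get_examples():
                # `._.rel` is a hypothetical custom extension holding the
                # gold-standard relations for each candidate pair.
                for label_dict in example.reference._.rel.values():
                    for label in label_dict:
                        self.add_label(label)
        # Give Thinc a small sample of inputs and outputs so it can run
        # shape inference throughout the layers of the network.
        subbatch = list(islice(get_examples(), 10))
        docs = [eg.reference for eg in subbatch]
        label_sample = self._examples_to_truth(subbatch)  # hypothetical helper
        self.model.initialize(X=docs, Y=label_sample)
```
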
@@ -729,7 +729,7 @@ and its internal model can be trained and used to make predictions.
 During training, the function [`update`](/api/pipe#update) is invoked which
 delegates to
 [`Model.begin_update`](https://thinc.ai/docs/api-model#begin_update) and a
-[`get_loss`](/api/pipe#get_loss) function that **calculate the loss** for a
+[`get_loss`](/api/pipe#get_loss) function that **calculates the loss** for a
 batch of examples, as well as the **gradient** of loss that will be used to
 update the weights of the model layers. Thinc provides several
 [loss functions](https://thinc.ai/docs/api-loss) that can be used for the
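
As an illustration of the `update`/`get_loss` interplay described in this hunk, the sketch below shows how a custom component might delegate to `Model.begin_update`, compute a loss and gradient, and apply the optimizer. The squared-error loss and the `_examples_to_truth` helper are assumptions made for the example; one of Thinc's built-in loss functions could be substituted.

```python
from typing import Dict, Iterable, Optional, Tuple

from spacy.pipeline import TrainablePipe
from spacy.training import Example
from thinc.api import Optimizer, set_dropout_rate
from thinc.types import Floats2d


class RelationExtractor(TrainablePipe):
    # Only update and get_loss are sketched; the other methods are omitted.

    def update(
        self,
        examples: Iterable[Example],
        *,
        drop: float = 0.0,
        sgd: Optional[Optimizer] = None,
        losses: Optional[Dict[str, float]] = None,
    ) -> Dict[str, float]:
        losses = {} if losses is None else losses
        losses.setdefault(self.name, 0.0)
        examples = list(examples)
        docs = [eg.predicted for eg in examples]
        set_dropout_rate(self.model, drop)
        # Forward pass: returns the predictions and a backprop callback.
        predictions, backprop = self.model.begin_update(docs)
        loss, gradient = self.get_loss(examples, predictions)
        # Backward pass: propagate the gradient of the loss through the model
        # and, if an optimizer is given, apply the accumulated weight updates.
        backprop(gradient)
        if sgd is not None:
            self.finish_update(sgd)
        losses[self.name] += loss
        return losses

    def get_loss(self, examples: Iterable[Example], scores: Floats2d) -> Tuple[float, Floats2d]:
        # Squared-error loss against the gold annotations; Thinc's built-in
        # loss functions could be used here instead.
        truths = self._examples_to_truth(list(examples))  # hypothetical helper
        gradient = scores - truths
        loss = float((gradient ** 2).sum())
        return loss, gradient
```
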