Update docs [ci skip]

Ines Montani 2020-09-21 10:55:36 +02:00
parent b9d2b29684
commit 9d32cac736
2 changed files with 14 additions and 6 deletions


@@ -921,6 +921,14 @@ package is installed in the same environment as spaCy, it will automatically add
[parallel training](/usage/training#parallel-training) for more details on how
it works under the hood.
<Project id="integrations/ray">
Get started with parallel training using our project template. It trains a
simple model on a Universal Dependencies Treebank and lets you parallelize the
training with Ray.
</Project>
You can integrate [`spacy ray train`](/api/cli#ray-train) into your
`project.yml` just like the regular training command: pass it the config, an
optional output directory or remote storage URL, and any config overrides.
@@ -940,10 +948,6 @@ commands:
- "training/model-best"
```
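Put together, a `project.yml` command wrapping `spacy ray train` might look like the following. This is a hedged sketch: the command name, file paths, and worker count are illustrative, not taken from the docs' full example.

```yaml
commands:
  - name: "ray-train"
    help: "Train the pipeline in parallel with Ray"
    script:
      # Assumed paths; adjust config, corpus and output locations to your project
      - "python -m spacy ray train configs/config.cfg --n-workers 2 --output training/"
    deps:
      - "configs/config.cfg"
      - "corpus/train.spacy"
    outputs:
      - "training/model-best"
```

You could then run it like any other project command, e.g. `python -m spacy project run ray-train`.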
<!-- TODO: <Project id="integrations/ray">
</Project> -->
---
### Weights & Biases {#wandb} <IntegrationLogo name="wandb" width={175} height="auto" align="right" />


@@ -895,9 +895,13 @@ cluster. If it's not set, Ray will run locally.
python -m spacy ray train config.cfg --n-workers 2
```
<!-- TODO: <Project id="integrations/ray">
</Project> -->
<Project id="integrations/ray">
Get started with parallel training using our project template. It trains a
simple model on a Universal Dependencies Treebank and lets you parallelize the
training with Ray.
</Project>
### How parallel training works {#parallel-training-details}