From 1dec138e61f41963096772cd096e1a1e07ae2ce9 Mon Sep 17 00:00:00 2001
From: Raphael Mitsch
Date: Thu, 5 Oct 2023 08:50:41 +0200
Subject: [PATCH] Update docs w.r.t. PaLM support. (#13018)

---
 website/docs/api/large-language-models.mdx   | 7 +++++++
 website/docs/usage/large-language-models.mdx | 5 +++--
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/website/docs/api/large-language-models.mdx b/website/docs/api/large-language-models.mdx
index 845edaa1a..aac4c5108 100644
--- a/website/docs/api/large-language-models.mdx
+++ b/website/docs/api/large-language-models.mdx
@@ -1022,6 +1022,7 @@ Currently, these models are provided as part of the core library:
 | `spacy.Claude-1-3.v1`         | Anthropic | `["claude-1.3", "claude-1.3-100k"]`                 | `"claude-1.3"`         | `{}` |
 | `spacy.Claude-instant-1.v1`   | Anthropic | `["claude-instant-1", "claude-instant-1-100k"]`     | `"claude-instant-1"`   | `{}` |
 | `spacy.Claude-instant-1-1.v1` | Anthropic | `["claude-instant-1.1", "claude-instant-1.1-100k"]` | `"claude-instant-1.1"` | `{}` |
+| `spacy.PaLM.v1`               | Google    | `["chat-bison-001", "text-bison-001"]`              | `"text-bison-001"`     | `{temperature=0.0}` |
 
 To use these models, make sure that you've [set the relevant API](#api-keys)
 keys as environment variables.
@@ -1052,6 +1053,12 @@ For Anthropic:
 ```shell
 export ANTHROPIC_API_KEY="..."
 ```
 
+For PaLM:
+
+```shell
+export PALM_API_KEY="..."
+```
+
 ### Models via HuggingFace {id="models-hf"}
 
 These models all take the same parameters:
diff --git a/website/docs/usage/large-language-models.mdx b/website/docs/usage/large-language-models.mdx
index 86f44f5ae..35117ef57 100644
--- a/website/docs/usage/large-language-models.mdx
+++ b/website/docs/usage/large-language-models.mdx
@@ -170,8 +170,8 @@ to be `"databricks/dolly-v2-12b"` for better performance.
 ### Example 3: Create the component directly in Python {id="example-3"}
 
 The `llm` component behaves as any other component does, and there are
-[task-specific components](/api/large-language-models#config) defined to
-help you hit the ground running with a reasonable built-in task implementation.
+[task-specific components](/api/large-language-models#config) defined to help
+you hit the ground running with a reasonable built-in task implementation.
 
 ```python
 import spacy
@@ -484,6 +484,7 @@ provider's documentation.
 | [`spacy.Claude-1-0.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.0` model family. |
 | [`spacy.Claude-1-2.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.2` model family. |
 | [`spacy.Claude-1-3.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.3` model family. |
+| [`spacy.PaLM.v1`](/api/large-language-models#models-rest)       | Google’s `PaLM` model family.          |
 | [`spacy.Dolly.v1`](/api/large-language-models#models-hf)        | Dolly models through HuggingFace.      |
 | [`spacy.Falcon.v1`](/api/large-language-models#models-hf)       | Falcon models through HuggingFace.     |
 | [`spacy.Llama2.v1`](/api/large-language-models#models-hf)       | Llama2 models through HuggingFace.     |
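For reviewers: with this patch applied, the documented `spacy.PaLM.v1` entry is selected like any other REST model in an `llm` pipeline config. A minimal sketch of such a config, assuming the `spacy.NER.v3` task and an explicit `temperature` override for illustration (neither is part of this patch; `PALM_API_KEY` must be exported as shown in the diff):

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

# Illustrative task choice; any spacy-llm task works here.
[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORG", "LOCATION"]

# The model registry entry documented by this patch.
[components.llm.model]
@llm_models = "spacy.PaLM.v1"
name = "text-bison-001"
config = {"temperature": 0.0}
```

Loading this via `spacy.util.load_config` / `assemble` follows the same path as the OpenAI and Anthropic examples already in the usage docs.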