Anthropic
You can get an API key from the Anthropic console.
Chat model
We recommend configuring Claude 4 Sonnet as your chat model.
- YAML

models:
  - name: Claude 4 Sonnet
    provider: anthropic
    model: claude-sonnet-4-20250514
    apiKey: <YOUR_ANTHROPIC_API_KEY>

- JSON

{
  "models": [
    {
      "title": "Claude 4 Sonnet",
      "provider": "anthropic",
      "model": "claude-sonnet-4-latest",
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
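If you want to confirm that your API key and model ID work before wiring them into the editor, you can call the Anthropic Messages API directly. This is a minimal sketch, not part of the configuration above: it assumes the official anthropic Python SDK is installed (pip install anthropic), that ANTHROPIC_API_KEY is set in your environment, and the prompt text is only illustrative.

import anthropic

# Minimal sanity check. Assumes ANTHROPIC_API_KEY is set in your environment;
# you can also pass api_key="<YOUR_ANTHROPIC_API_KEY>" explicitly.
client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # same model ID as the YAML example above
    max_tokens=64,
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.content[0].text)

If this prints a reply, the same key and model ID should work in the configuration above.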
Autocomplete model
Anthropic currently does not offer any autocomplete models.
Click here to see a list of autocomplete model providers.
Embeddings model
Anthropic currently does not offer any embeddings models.
Click here to see a list of embeddings model providers.
Reranking model
Anthropic currently does not offer any reranking models.
Click here to see a list of reranking model providers.
Prompt caching
Anthropic supports prompt caching, which allows Claude models to reuse a cached system message and conversation history across requests, improving response times and reducing costs.
Prompt caching is generally available for:
- Claude 4 Sonnet
- Claude 3.7 Sonnet
- Claude 3.5 Sonnet
- Claude 3.5 Haiku
To enable caching of the system message and the turn-by-turn conversation, update your model configuration as follows:
- YAML

models:
  - name: Anthropic
    provider: anthropic
    model: claude-sonnet-4-20250514
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
    defaultCompletionOptions:
      promptCaching: true

- JSON

{
  "models": [
    {
      "cacheBehavior": {
        "cacheSystemMessage": true,
        "cacheConversation": true
      },
      "title": "Anthropic",
      "provider": "anthropic",
      "model": "claude-sonnet-4-latest",
      "defaultCompletionOptions": {
        "promptCaching": true
      },
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
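At the API level, prompt caching is requested per content block with a cache_control marker on the Messages API; the configuration above enables this for you. As a rough illustration only (not what the extension literally sends), a direct call with the anthropic Python SDK that caches a long system prompt and reports cache usage might look like this; the system prompt here is a placeholder, and the minimum cacheable prefix length depends on the model (roughly 1024 tokens for Sonnet models).

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder text; a real system prompt must exceed the model's minimum
# cacheable length for the cache to be used.
LONG_SYSTEM_PROMPT = "You are a coding assistant for this repository. " * 400

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=256,
    system=[
        {
            "type": "text",
            "text": LONG_SYSTEM_PROMPT,
            # Marks this block as cacheable; later requests with the same
            # prefix can read it from the cache instead of reprocessing it.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarize your instructions in one sentence."}],
)

# The usage object shows whether the cache was written to or read from.
print(response.usage.cache_creation_input_tokens, response.usage.cache_read_input_tokens)

On the first request the cached tokens are reported under cache_creation_input_tokens; repeat requests with the same prefix report them under cache_read_input_tokens, which is billed at a reduced rate.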