<Tabs>
<Tab title="YAML">

```yaml title="config.yaml"
models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
```

</Tab>
<Tab title="JSON (Deprecated)">

```json title="config.json"
{
  "models": [
    {
      "title": "<MODEL_NAME>",
      "provider": "anthropic",
      "model": "<MODEL_ID>",
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
```

</Tab>
</Tabs>
Anthropic supports prompt caching, which allows Claude models to reuse cached system messages and conversation history across requests, improving latency and reducing costs.
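Under the hood, prompt caching works by tagging content blocks in the request to Anthropic's Messages API with `cache_control`. As a rough sketch (the exact payload sent on your behalf may differ), a cached system message in a raw Messages API request body looks like this:

```json
{
  "model": "<MODEL_ID>",
  "max_tokens": 1024,
  "system": [
    {
      "type": "text",
      "text": "<SYSTEM_MESSAGE>",
      "cache_control": { "type": "ephemeral" }
    }
  ],
  "messages": [
    { "role": "user", "content": "<USER_MESSAGE>" }
  ]
}
```

Subsequent requests that share the cached prefix read it from the cache instead of reprocessing it, and cache reads are billed at a reduced input-token rate.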
To enable caching of the system message and the turn-by-turn conversation, update your model configuration as follows:
<Tabs>
<Tab title="YAML">

```yaml title="config.yaml"
name: My Config
version: 0.0.1
schema: v1
models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
    defaultCompletionOptions:
      promptCaching: true
```

</Tab>
<Tab title="JSON (Deprecated)">

```json title="config.json"
{
  "models": [
    {
      "title": "<MODEL_NAME>",
      "provider": "anthropic",
      "model": "<MODEL_ID>",
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>",
      "cacheBehavior": {
        "cacheSystemMessage": true,
        "cacheConversation": true
      }
    }
  ]
}
```

</Tab>
</Tabs>