# How to Configure Anthropic Claude Models with Continue

<Tip>
  **Discover Anthropic models [here](https://continue.dev/anthropic)**
</Tip>

<Info>
  Get an API key from the [Anthropic Console](https://console.anthropic.com/account/keys)
</Info>

## Configuration

<Tabs>
<Tab title="YAML">
```yaml title="config.yaml"
name: My Config
version: 0.0.1
schema: v1

models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
```
</Tab>
<Tab title="JSON (Deprecated)">
```json title="config.json"
{
  "models": [
    {
      "title": "<MODEL_NAME>",
      "provider": "anthropic",
      "model": "<MODEL_ID>",
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
```
</Tab>
</Tabs>

<Info>
  **Check out a more advanced configuration [here](https://continue.dev/anthropic/claude-sonnet-4-6?view=config)**
</Info>
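As a concrete illustration, here is the YAML configuration with the placeholders filled in. The model ID `claude-3-5-sonnet-latest` is only an example; check Anthropic's model documentation for the current IDs available to your account.

```yaml title="config.yaml"
name: My Config
version: 0.0.1
schema: v1

models:
  # Example entry; substitute your own model ID and API key
  - name: Claude Sonnet
    provider: anthropic
    model: claude-3-5-sonnet-latest
    apiKey: <YOUR_ANTHROPIC_API_KEY>
```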

## How to Enable Prompt Caching with Claude

Anthropic supports prompt caching, which lets Claude models reuse cached system messages and conversation history across requests, improving response latency and reducing costs.

To enable caching of the system message and the turn-by-turn conversation, update your model configuration as follows:

<Tabs>
<Tab title="YAML">
```yaml title="config.yaml"
name: My Config
version: 0.0.1
schema: v1

models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
    defaultCompletionOptions:
      promptCaching: true
```
</Tab>
<Tab title="JSON (Deprecated)">
```json title="config.json"
{
  "models": [
    {
      "cacheBehavior": {
        "cacheSystemMessage": true,
        "cacheConversation": true
      },
      "title": "<MODEL_NAME>",
      "provider": "anthropic",
      "model": "<MODEL_ID>",
      "defaultCompletionOptions": {
        "promptCaching": true
      },
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
```
</Tab>
</Tabs>
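For example, a filled-in caching configuration might look like the following. The model ID `claude-3-5-sonnet-latest` is illustrative; caching is most beneficial when your system message or conversation history is long enough to exceed Anthropic's minimum cacheable prompt length.

```yaml title="config.yaml"
name: My Config
version: 0.0.1
schema: v1

models:
  # Example entry with prompt caching enabled for chat
  - name: Claude Sonnet
    provider: anthropic
    model: claude-3-5-sonnet-latest
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
    defaultCompletionOptions:
      promptCaching: true
```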