# Anthropic

docs/language-models/hosted-models/anthropic.mdx


To use Open Interpreter with a model from Anthropic, set the model flag:

<CodeGroup>

```bash Terminal
interpreter --model claude-instant-1
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "claude-instant-1"
interpreter.chat()
```

</CodeGroup>

## Supported Models

We support any model from Anthropic:

<CodeGroup>

```bash Terminal
interpreter --model claude-instant-1
interpreter --model claude-instant-1.2
interpreter --model claude-2
```

```python Python
interpreter.llm.model = "claude-instant-1"
interpreter.llm.model = "claude-instant-1.2"
interpreter.llm.model = "claude-2"
```

</CodeGroup>

## Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `ANTHROPIC_API_KEY` | The API key for authenticating to Anthropic's services. | Anthropic |
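If you prefer not to export the variable in your shell, you can set it from Python before starting a chat. A minimal sketch (the key value below is a placeholder, not a real credential):

```python
import os

# Placeholder value; replace with your actual Anthropic API key.
# Open Interpreter (via its LLM backend) reads ANTHROPIC_API_KEY
# from the environment at request time.
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-api-key"
```

Set the variable before calling `interpreter.chat()` so the key is available when the first request to Anthropic is made.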