docs/language-models/hosted-models/baseten.mdx
To use Open Interpreter with Baseten, set the model flag:
```shell
interpreter --model baseten/<baseten-model>
```
Or, from Python:

```python
from interpreter import interpreter

interpreter.llm.model = "baseten/<baseten-model>"
interpreter.chat()
```
We support the following completion models from Baseten:
```shell
interpreter --model baseten/qvv0xeq
interpreter --model baseten/q841o8w
interpreter --model baseten/31dxrj3
```
```python
interpreter.llm.model = "baseten/qvv0xeq"
interpreter.llm.model = "baseten/q841o8w"
interpreter.llm.model = "baseten/31dxrj3"
```
Set the following environment variable to use these models.
| Environment Variable | Description | Where to Find |
|---|---|---|
| `BASETEN_API_KEY` | Your Baseten API key | Baseten Dashboard -> Settings -> Account -> API Keys |
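If you prefer to configure the key from Python rather than your shell profile, you can set it in the process environment before starting a chat. A minimal sketch (the key value below is a placeholder; substitute your own key from the Baseten dashboard):

```python
import os

# Placeholder value; replace with your real Baseten API key.
os.environ["BASETEN_API_KEY"] = "your-api-key"
```

Setting the variable this way only affects the current process; exporting it in your shell (e.g. `export BASETEN_API_KEY=...`) makes it available to every `interpreter` invocation.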