
Baseten


To use Open Interpreter with Baseten, set the model flag:

<CodeGroup>

```bash
interpreter --model baseten/<baseten-model>
```

```python
from interpreter import interpreter

interpreter.llm.model = "baseten/<baseten-model>"
interpreter.chat()
```

</CodeGroup>

Supported Models

We support the following completion models from Baseten:

  • Falcon 7b (qvv0xeq)
  • Wizard LM (q841o8w)
  • MPT 7b Base (31dxrj3)
<CodeGroup>

```bash
interpreter --model baseten/qvv0xeq
interpreter --model baseten/q841o8w
interpreter --model baseten/31dxrj3
```

```python
interpreter.llm.model = "baseten/qvv0xeq"
interpreter.llm.model = "baseten/q841o8w"
interpreter.llm.model = "baseten/31dxrj3"
```

</CodeGroup>

Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `BASETEN_API_KEY` | Baseten API key | Baseten Dashboard -> Settings -> Account -> API Keys |
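As a minimal sketch, the variable can also be set from Python before using Open Interpreter (the key value below is a placeholder, not a real credential):

```python
import os

# Placeholder value; a real key comes from the Baseten dashboard
os.environ["BASETEN_API_KEY"] = "<your-baseten-api-key>"
```

Setting it in the process environment this way is equivalent to exporting it in your shell before launching `interpreter`.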