Mistral AI API
To use Open Interpreter with the Mistral API, set the model flag:

<CodeGroup>

```bash
interpreter --model mistral/<mistral-model>
```

```python
from interpreter import interpreter

interpreter.llm.model = "mistral/<mistral-model>"
interpreter.chat()
```

</CodeGroup>

Supported Models

We support the following completion models from the Mistral API:

  • mistral-tiny
  • mistral-small
  • mistral-medium
<CodeGroup>

```bash
interpreter --model mistral/mistral-tiny
interpreter --model mistral/mistral-small
interpreter --model mistral/mistral-medium
```

```python
interpreter.llm.model = "mistral/mistral-tiny"
interpreter.llm.model = "mistral/mistral-small"
interpreter.llm.model = "mistral/mistral-medium"
```

</CodeGroup>

Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description          | Where to Find       |
| -------------------- | -------------------- | ------------------- |
| `MISTRAL_API_KEY`    | The Mistral API key  | Mistral API Console |
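For example, the key can be exported in a bash-compatible shell before launching the interpreter; the value shown is a placeholder for your own key:

```shell
# Export the Mistral API key (placeholder value) so Open Interpreter can read it
export MISTRAL_API_KEY="<your-mistral-api-key>"

# Then start a session with one of the supported models
interpreter --model mistral/mistral-tiny
```

Exported this way, the variable only lasts for the current shell session; add the `export` line to your shell profile to make it persistent.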