---
title: Mistral API
---
To use Open Interpreter with the Mistral API, set the `model` flag:

<CodeGroup>

```bash Terminal
interpreter --model mistral/<mistral-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "mistral/<mistral-model>"
interpreter.chat()
```

</CodeGroup>
# Supported Models

We support the following completion models from the Mistral API:

- Mistral Tiny
- Mistral Small
- Mistral Medium

<CodeGroup>

```bash Terminal
interpreter --model mistral/mistral-tiny
interpreter --model mistral/mistral-small
interpreter --model mistral/mistral-medium
```

```python Python
interpreter.llm.model = "mistral/mistral-tiny"
interpreter.llm.model = "mistral/mistral-small"
interpreter.llm.model = "mistral/mistral-medium"
```

</CodeGroup>
# Required Environment Variables

Set the following environment variables (click here to learn how) to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `MISTRAL_API_KEY` | The API key for authenticating to the Mistral API | [Mistral API Console](https://console.mistral.ai/) |
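If you prefer not to export the variable in your shell profile, a minimal sketch of setting it from Python before starting a session (the key string below is a placeholder, not a real credential):

```python
import os

# Placeholder value; substitute the real key from the Mistral API Console.
# setdefault leaves an already-exported MISTRAL_API_KEY untouched.
os.environ.setdefault("MISTRAL_API_KEY", "your-api-key")

print(os.environ["MISTRAL_API_KEY"])
```

This must run before the first `interpreter.chat()` call so the key is visible when the request is made.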