---
title: DeepInfra
---

To use Open Interpreter with DeepInfra, set the model flag:

<CodeGroup>

```bash Terminal
interpreter --model deepinfra/<deepinfra-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "deepinfra/<deepinfra-model>"
interpreter.chat()
```

</CodeGroup>

# Supported Models

We support the following completion models from DeepInfra:

<CodeGroup>

```bash Terminal
interpreter --model deepinfra/meta-llama/Llama-2-70b-chat-hf
interpreter --model deepinfra/meta-llama/Llama-2-7b-chat-hf
interpreter --model deepinfra/meta-llama/Llama-2-13b-chat-hf
interpreter --model deepinfra/codellama/CodeLlama-34b-Instruct-hf
interpreter --model deepinfra/mistral/mistral-7b-instruct-v0.1
interpreter --model deepinfra/jondurbin/airoboros-l2-70b-gpt4-1.4.1
```

interpreter.llm.model = "deepinfra/meta-llama/Llama-2-70b-chat-hf"
interpreter.llm.model = "deepinfra/meta-llama/Llama-2-7b-chat-hf"
interpreter.llm.model = "deepinfra/meta-llama/Llama-2-13b-chat-hf"
interpreter.llm.model = "deepinfra/codellama/CodeLlama-34b-Instruct-hf"
interpreter.llm.model = "deepinfra/mistral-7b-instruct-v0.1"
interpreter.llm.model = "deepinfra/jondurbin/airoboros-l2-70b-gpt4-1.4.1"
# Required Environment Variables

Set the following environment variables (click here to learn how) to use these models.

| Environment Variable | Description | Where to Find |
|---|---|---|
| `DEEPINFRA_API_KEY` | Your DeepInfra API key | DeepInfra Dashboard -> API Keys |
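
For example, the key can also be set from Python before the first chat call (a minimal sketch; the value below is a placeholder, not a real key):

```python
import os

# Placeholder; paste the key generated in your DeepInfra dashboard.
# Set it before the first interpreter.chat() call so requests can authenticate.
os.environ["DEEPINFRA_API_KEY"] = "your-deepinfra-api-key"
```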