docs/language-models/hosted-models/anyscale.mdx
To use Open Interpreter with a model from Anyscale, set the `--model` flag:

```bash
interpreter --model anyscale/<model-name>
```
Or, from Python:

```python
from interpreter import interpreter

# Set the model to use from Anyscale:
interpreter.llm.model = "anyscale/<model-name>"
interpreter.chat()
```
We support the following completion models from Anyscale:
```bash
interpreter --model anyscale/meta-llama/Llama-2-7b-chat-hf
interpreter --model anyscale/meta-llama/Llama-2-13b-chat-hf
interpreter --model anyscale/meta-llama/Llama-2-70b-chat-hf
interpreter --model anyscale/mistralai/Mistral-7B-Instruct-v0.1
interpreter --model anyscale/codellama/CodeLlama-34b-Instruct-hf
```
Or, from Python:

```python
interpreter.llm.model = "anyscale/meta-llama/Llama-2-7b-chat-hf"
interpreter.llm.model = "anyscale/meta-llama/Llama-2-13b-chat-hf"
interpreter.llm.model = "anyscale/meta-llama/Llama-2-70b-chat-hf"
interpreter.llm.model = "anyscale/mistralai/Mistral-7B-Instruct-v0.1"
interpreter.llm.model = "anyscale/codellama/CodeLlama-34b-Instruct-hf"
```
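For example, a complete session with one of the models above might look like this minimal sketch (the prompt is illustrative, and the `ANYSCALE_API_KEY` environment variable described below must be set first):

```python
from interpreter import interpreter

# Choose one of the supported Anyscale completion models.
interpreter.llm.model = "anyscale/mistralai/Mistral-7B-Instruct-v0.1"

# Passing a string sends a single message; calling chat() with
# no arguments starts an interactive session instead.
interpreter.chat("Print the first ten prime numbers.")
```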
Set the following environment variable (click here to learn how) to use these models.
| Environment Variable | Description | Where to Find |
|---|---|---|
| `ANYSCALE_API_KEY` | The API key for your Anyscale account. | Anyscale Account Settings |
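If you'd rather set the key from Python than from your shell, one option is to export it for the current process before starting a chat. A minimal sketch, assuming the placeholder below is replaced with your real key:

```python
import os

# Placeholder value; use your actual key from your Anyscale account settings.
os.environ["ANYSCALE_API_KEY"] = "your_anyscale_api_key"
```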