docs/language-models/hosted-models/huggingface.mdx
To use Open Interpreter with Hugging Face models, set the `model` flag:

<CodeGroup>

```bash Terminal
interpreter --model huggingface/<huggingface-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```

</CodeGroup>
You may also need to specify your Hugging Face API base URL:

<CodeGroup>

```bash Terminal
interpreter --api_base <https://my-endpoint.huggingface.cloud>
```

```python Python
from interpreter import interpreter

interpreter.llm.api_base = "https://my-endpoint.huggingface.cloud"
interpreter.chat()
```

</CodeGroup>
Open Interpreter should work with almost any text-based Hugging Face model.
Set the following environment variables (click here to learn how) to use these models.
| Environment Variable | Description | Where to Find |
|---|---|---|
| `HUGGINGFACE_API_KEY` | Hugging Face account API key | Hugging Face -> Settings -> Access Tokens |
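
For example, you can export the variable in your shell before launching Open Interpreter. The token value below is a placeholder, not a real key:

```shell
# Placeholder token for illustration only; copy a real token from
# Hugging Face -> Settings -> Access Tokens.
export HUGGINGFACE_API_KEY="hf_xxxxxxxxxxxxxxxx"
```

Open Interpreter reads this variable from the environment at startup, so it must be set in the same shell session (or your shell profile) before you run `interpreter`.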