
# Huggingface

To use Open Interpreter with Huggingface models, set the model flag:

<CodeGroup>

```bash
interpreter --model huggingface/<huggingface-model>
```

```python
from interpreter import interpreter

interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```

</CodeGroup>

You may also need to specify your Huggingface API base URL:

<CodeGroup>

```bash
interpreter --api_base <https://my-endpoint.huggingface.cloud>
```

```python
from interpreter import interpreter

interpreter.llm.api_base = "https://my-endpoint.huggingface.cloud"
interpreter.chat()
```

</CodeGroup>

## Supported Models

Open Interpreter should work with almost any text-based Hugging Face model.

## Required Environment Variables

Set the following environment variable to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `HUGGINGFACE_API_KEY` | Huggingface account API key | Huggingface -> Settings -> Access Tokens |
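
As a minimal sketch, the token can also be set from Python before starting a chat; the token value and model name below are placeholders, not real values:

```python
import os

from interpreter import interpreter

# Placeholder token; generate a real one under Huggingface -> Settings -> Access Tokens.
os.environ["HUGGINGFACE_API_KEY"] = "hf_..."

interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```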