Petals

To use Open Interpreter with a model from Petals, set the `model` flag to begin with `petals/`:

<CodeGroup>

```bash
interpreter --model petals/petals-team/StableBeluga2
```

```python
from interpreter import interpreter

interpreter.llm.model = "petals/petals-team/StableBeluga2"
interpreter.chat()
```

</CodeGroup>
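The `petals/` prefix is what routes the request to the Petals backend: everything before the first slash names the provider, and the remainder is the model ID passed to Petals. A minimal sketch of that split (the helper name is hypothetical, for illustration only, not part of the Open Interpreter API):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a model string like 'petals/petals-team/StableBeluga2'
    into (provider, model_id). Hypothetical helper for illustration."""
    provider, _, model_id = model.partition("/")
    return provider, model_id

provider, model_id = split_model_string("petals/petals-team/StableBeluga2")
print(provider, model_id)  # → petals petals-team/StableBeluga2
```

Note that the model ID itself may contain slashes (it is a Hugging Face-style `org/name` identifier), so only the first slash is significant.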

Pre-Requisites

Ensure you have `petals` installed:

```bash
pip install git+https://github.com/bigscience-workshop/petals
```

Supported Models

We support any model on Petals:

<CodeGroup>

```bash
interpreter --model petals/petals-team/StableBeluga2
interpreter --model petals/huggyllama/llama-65b
```

```python
interpreter.llm.model = "petals/petals-team/StableBeluga2"
interpreter.llm.model = "petals/huggyllama/llama-65b"
```

</CodeGroup>

Required Environment Variables

No environment variables are required to use these models.