To use Open Interpreter with a model from Petals, set the `model` flag to begin with `petals/`:

<CodeGroup>

```bash Terminal
interpreter --model petals/petals-team/StableBeluga2
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "petals/petals-team/StableBeluga2"
interpreter.chat()
```

</CodeGroup>
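The `petals/` prefix follows the common `provider/model` convention: the text before the first slash selects the provider, and everything after it is the model ID on Petals. A minimal sketch of that split (illustrative only, not Open Interpreter's internal routing code):

```python
# Illustrative sketch: splitting a "provider/model" string, as used by
# provider-prefixed model names like "petals/petals-team/StableBeluga2".
model = "petals/petals-team/StableBeluga2"

# Split on the FIRST slash only, since the model ID may itself contain slashes.
provider, model_id = model.split("/", 1)

print(provider)   # petals
print(model_id)   # petals-team/StableBeluga2
```

Note that the model ID (`petals-team/StableBeluga2`) keeps its own slash, which is why only the first slash is treated as the provider separator.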
Ensure you have `petals` installed:

```bash
pip install git+https://github.com/bigscience-workshop/petals
```
We support any model on Petals:

<CodeGroup>

```bash Terminal
interpreter --model petals/petals-team/StableBeluga2
interpreter --model petals/huggyllama/llama-65b
```

```python Python
interpreter.llm.model = "petals/petals-team/StableBeluga2"
interpreter.llm.model = "petals/huggyllama/llama-65b"
```

</CodeGroup>
No environment variables are required to use these models.