docs/language-models/hosted-models/cloudflare.mdx
To use Open Interpreter with the Cloudflare Workers AI API, set the model flag:

```bash
interpreter --model cloudflare/<cloudflare-model>
```

Or, in Python:

```python
from interpreter import interpreter

interpreter.llm.model = "cloudflare/<cloudflare-model>"
interpreter.chat()
```
We support the following completion models from Cloudflare Workers AI:

```bash
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-fp16
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-int8
interpreter --model cloudflare/@cf/mistral/mistral-7b-instruct-v0.1
interpreter --model cloudflare/@hf/thebloke/codellama-7b-instruct-awq
```

Or, in Python:

```python
interpreter.llm.model = "cloudflare/@cf/meta/llama-2-7b-chat-fp16"
interpreter.llm.model = "cloudflare/@cf/meta/llama-2-7b-chat-int8"
interpreter.llm.model = "cloudflare/@cf/mistral/mistral-7b-instruct-v0.1"
interpreter.llm.model = "cloudflare/@hf/thebloke/codellama-7b-instruct-awq"
```
Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `CLOUDFLARE_API_KEY` | Your Cloudflare API key | Cloudflare Profile Page -> API Tokens |
| `CLOUDFLARE_ACCOUNT_ID` | Your Cloudflare account ID | Cloudflare Dashboard -> grab the Account ID from the URL, e.g. `https://dash.cloudflare.com/{CLOUDFLARE_ACCOUNT_ID}?account=` |
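If you prefer to keep everything in a script, the variables from the table above can also be set in Python before Open Interpreter makes a request. A minimal sketch (the values below are placeholders, not real credentials):

```python
import os

# Placeholder credentials — replace with your own values from the
# Cloudflare dashboard (see the table above for where to find them).
os.environ["CLOUDFLARE_API_KEY"] = "<your-cloudflare-api-key>"
os.environ["CLOUDFLARE_ACCOUNT_ID"] = "<your-cloudflare-account-id>"

# With both variables set, the "cloudflare/..." model strings shown
# earlier can be used via interpreter.llm.model as usual.
```

Setting the variables in your shell profile instead (`export CLOUDFLARE_API_KEY=...`) works equally well and keeps credentials out of your code.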