Cloudflare Workers AI


To use Open Interpreter with the Cloudflare Workers AI API, set the model flag:

<CodeGroup>

```bash Terminal
interpreter --model cloudflare/<cloudflare-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "cloudflare/<cloudflare-model>"
interpreter.chat()
```

</CodeGroup>

Supported Models

We support the following completion models from Cloudflare Workers AI:

  • Llama-2 7b chat fp16
  • Llama-2 7b chat int8
  • Mistral 7b instruct v0.1
  • CodeLlama 7b instruct awq
<CodeGroup>

```bash Terminal
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-fp16
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-int8
interpreter --model cloudflare/@cf/mistral/mistral-7b-instruct-v0.1
interpreter --model cloudflare/@hf/thebloke/codellama-7b-instruct-awq
```

```python Python
interpreter.llm.model = "cloudflare/@cf/meta/llama-2-7b-chat-fp16"
interpreter.llm.model = "cloudflare/@cf/meta/llama-2-7b-chat-int8"
interpreter.llm.model = "cloudflare/@cf/mistral/mistral-7b-instruct-v0.1"
interpreter.llm.model = "cloudflare/@hf/thebloke/codellama-7b-instruct-awq"
```

</CodeGroup>

Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `CLOUDFLARE_API_KEY` | Cloudflare API key | Cloudflare Profile Page -> API Tokens |
| `CLOUDFLARE_ACCOUNT_ID` | Your Cloudflare account ID | Cloudflare Dashboard -> grab the account ID from the URL, e.g. `https://dash.cloudflare.com/{CLOUDFLARE_ACCOUNT_ID}?account=` |
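For example, in a bash-compatible shell you can export both variables before launching Open Interpreter (the values below are placeholders; substitute your own credentials):

```shell
# Placeholder credentials — replace with your own API token and account ID
export CLOUDFLARE_API_KEY="<your-api-token>"
export CLOUDFLARE_ACCOUNT_ID="<your-account-id>"

# Then start Open Interpreter with a Cloudflare Workers AI model
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-fp16
```

To persist these across sessions, add the `export` lines to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`).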