Cloudflare AI Gateway is a proxy service that routes requests to AI providers through Cloudflare's infrastructure, adding caching, unified analytics, rate limiting, and request logging on top of the upstream APIs.

The `cloudflare-gateway` provider lets you route your promptfoo evaluations through Cloudflare AI Gateway to any supported AI provider.
```
cloudflare-gateway:{provider}:{model}
```
Examples:
```
cloudflare-gateway:openai:gpt-5.2
cloudflare-gateway:anthropic:claude-sonnet-4-5-20250929
cloudflare-gateway:groq:llama-3.3-70b-versatile
```

Set your Cloudflare account ID and gateway ID:
```sh
export CLOUDFLARE_ACCOUNT_ID=your_account_id_here
export CLOUDFLARE_GATEWAY_ID=your_gateway_id_here
```
You need API keys for the providers you're routing through:
```sh
# For OpenAI
export OPENAI_API_KEY=your_openai_key

# For Anthropic
export ANTHROPIC_API_KEY=your_anthropic_key

# For Groq
export GROQ_API_KEY=your_groq_key
```
If you've configured BYOK (Bring Your Own Key) in Cloudflare, you can omit provider API keys entirely; Cloudflare will use the keys stored in your gateway configuration.
```yaml
providers:
  # No OPENAI_API_KEY needed - Cloudflare uses stored key
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
      cfAigToken: '{{env.CF_AIG_TOKEN}}'
```
:::note
BYOK works best with OpenAI-compatible providers. Anthropic requires an API key because the SDK mandates it.
:::
If your gateway has Authenticated Gateway enabled, you must provide the `cfAigToken`:
```sh
export CF_AIG_TOKEN=your_gateway_token_here
```
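Cloudflare's gateway authentication rides alongside the upstream provider's own credentials. As a hedged illustration (header names per Cloudflare's AI Gateway docs; the exact wiring inside promptfoo may differ), a request through an authenticated gateway carries headers like these:

```python
import os

# Illustrative sketch: "cf-aig-token" authenticates to the Cloudflare gateway
# itself, while the upstream provider key (here OpenAI's) still travels in the
# usual Authorization header.
headers = {
    "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', 'sk-...')}",
    "cf-aig-token": os.environ.get("CF_AIG_TOKEN", "your_gateway_token_here"),
}
```

If the gateway token is missing or wrong, the gateway rejects the request before it ever reaches the provider, so a valid provider key alone is not enough.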
```yaml
prompts:
  - 'Answer this question: {{question}}'

providers:
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
      temperature: 0.7

tests:
  - vars:
      question: What is the capital of France?
```
Cloudflare AI Gateway supports routing to these providers:
| Provider | Gateway Name | API Key Environment Variable |
|---|---|---|
| OpenAI | `openai` | `OPENAI_API_KEY` |
| Anthropic | `anthropic` | `ANTHROPIC_API_KEY` |
| Groq | `groq` | `GROQ_API_KEY` |
| Perplexity | `perplexity-ai` | `PERPLEXITY_API_KEY` |
| Google AI Studio | `google-ai-studio` | `GOOGLE_API_KEY` |
| Mistral | `mistral` | `MISTRAL_API_KEY` |
| Cohere | `cohere` | `COHERE_API_KEY` |
| Azure OpenAI | `azure-openai` | `AZURE_OPENAI_API_KEY` |
| Workers AI | `workers-ai` | `CLOUDFLARE_API_KEY` |
| Hugging Face | `huggingface` | `HUGGINGFACE_API_KEY` |
| Replicate | `replicate` | `REPLICATE_API_KEY` |
| Grok (xAI) | `grok` | `XAI_API_KEY` |
:::note
AWS Bedrock is not supported through Cloudflare AI Gateway because it requires AWS request signing, which is incompatible with the gateway proxy approach.
:::
| Option | Type | Description |
|---|---|---|
| `accountId` | string | Cloudflare account ID |
| `accountIdEnvar` | string | Custom environment variable for account ID (default: `CLOUDFLARE_ACCOUNT_ID`) |
| `gatewayId` | string | AI Gateway ID |
| `gatewayIdEnvar` | string | Custom environment variable for gateway ID (default: `CLOUDFLARE_GATEWAY_ID`) |
| `cfAigToken` | string | Optional gateway authentication token |
| `cfAigTokenEnvar` | string | Custom environment variable for gateway token (default: `CF_AIG_TOKEN`) |
Azure OpenAI requires additional configuration:
| Option | Type | Description |
|---|---|---|
| `resourceName` | string | Azure OpenAI resource name (required) |
| `deploymentName` | string | Azure OpenAI deployment name (required) |
| `apiVersion` | string | Azure API version (default: `2024-12-01-preview`) |
```yaml
providers:
  - id: cloudflare-gateway:azure-openai:gpt-4
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
      resourceName: my-azure-resource
      deploymentName: my-gpt4-deployment
      apiVersion: 2024-12-01-preview
```
Workers AI routes requests to Cloudflare's edge-deployed models. The model name is included in the URL path:
```yaml
providers:
  - id: cloudflare-gateway:workers-ai:@cf/meta/llama-3.1-8b-instruct
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
```
All options from the underlying provider are supported. For example, when using `cloudflare-gateway:openai:gpt-5.2`, you can use any OpenAI provider options.
```yaml
providers:
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
      temperature: 0.8
      max_tokens: 1000
      top_p: 0.9
```
Compare responses from different providers, all routed through your Cloudflare gateway:
```yaml
prompts:
  - 'Explain {{topic}} in simple terms.'

providers:
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
  - id: cloudflare-gateway:anthropic:claude-sonnet-4-5-20250929
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
  - id: cloudflare-gateway:groq:llama-3.3-70b-versatile
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'

tests:
  - vars:
      topic: quantum computing
```
If your AI Gateway requires authentication:
```yaml
providers:
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountId: '{{env.CLOUDFLARE_ACCOUNT_ID}}'
      gatewayId: '{{env.CLOUDFLARE_GATEWAY_ID}}'
      cfAigToken: '{{env.CF_AIG_TOKEN}}'
```
Use custom environment variable names for different projects or environments:
```yaml
providers:
  - id: cloudflare-gateway:openai:gpt-5.2
    config:
      accountIdEnvar: MY_CF_ACCOUNT
      gatewayIdEnvar: MY_CF_GATEWAY
      apiKeyEnvar: MY_OPENAI_KEY
```
The provider constructs the gateway URL in this format:
```
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}
```
For example, with `accountId: abc123` and `gatewayId: my-gateway`, requests to OpenAI would be routed through:

```
https://gateway.ai.cloudflare.com/v1/abc123/my-gateway/openai
```
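The URL assembly can be sketched in a few lines (an illustration of the documented format, not promptfoo's actual implementation):

```python
def gateway_url(account_id: str, gateway_id: str, provider: str) -> str:
    # Mirrors the documented format:
    # https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}
    return f"https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}"

print(gateway_url("abc123", "my-gateway", "openai"))
# -> https://gateway.ai.cloudflare.com/v1/abc123/my-gateway/openai
```

The underlying provider's usual endpoint path (for example `/chat/completions`) is appended after the provider segment.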
- **Caching**: AI Gateway can cache identical requests, reducing costs when you run the same prompts multiple times (common during development and testing).
- **Unified analytics**: View usage across all your AI providers in a single Cloudflare dashboard, making it easier to track costs and usage patterns.
- **Rate limiting**: AI Gateway can help manage rate limits by queuing requests, preventing your evaluations from failing due to provider rate limits.
- **Logging**: All requests and responses are logged in Cloudflare, making it easier to debug issues and audit AI usage.