# Providers

Pi supports subscription-based providers via OAuth and API-key providers via environment variables or an auth file. For each provider, pi knows all available models; the list is updated with every pi release.

## Subscriptions

Use `/login` in interactive mode, then select a provider:

- ChatGPT Plus/Pro (Codex)
- Claude Pro/Max
- GitHub Copilot

Use `/logout` to clear credentials. Tokens are stored in `~/.pi/agent/auth.json` and auto-refresh when expired.

### OpenAI Codex

- Requires a ChatGPT Plus or Pro subscription
- Officially endorsed by OpenAI: Codex for OSS

### Claude Pro/Max

Anthropic subscription auth is available for Claude Pro/Max accounts. Third-party harness usage draws from extra usage and is billed per token, not against Claude plan limits.

### GitHub Copilot

- Press Enter for github.com, or enter your GitHub Enterprise Server domain
- If you get "model not supported", enable it in VS Code: Copilot Chat → model selector → select model → "Enable"

## API Keys

### Environment Variables or Auth File

Use `/login` in interactive mode and select a provider to store an API key in `auth.json`, or set credentials via an environment variable:

```bash
export ANTHROPIC_API_KEY=sk-ant-...
pi
```
| Provider | Environment Variable | `auth.json` key |
| --- | --- | --- |
| Anthropic | `ANTHROPIC_API_KEY` | `anthropic` |
| Azure OpenAI Responses | `AZURE_OPENAI_API_KEY` | `azure-openai-responses` |
| OpenAI | `OPENAI_API_KEY` | `openai` |
| DeepSeek | `DEEPSEEK_API_KEY` | `deepseek` |
| Google Gemini | `GEMINI_API_KEY` | `google` |
| Mistral | `MISTRAL_API_KEY` | `mistral` |
| Groq | `GROQ_API_KEY` | `groq` |
| Cerebras | `CEREBRAS_API_KEY` | `cerebras` |
| Cloudflare AI Gateway | `CLOUDFLARE_API_KEY` (+ `CLOUDFLARE_ACCOUNT_ID`, `CLOUDFLARE_GATEWAY_ID`) | `cloudflare-ai-gateway` |
| Cloudflare Workers AI | `CLOUDFLARE_API_KEY` (+ `CLOUDFLARE_ACCOUNT_ID`) | `cloudflare-workers-ai` |
| xAI | `XAI_API_KEY` | `xai` |
| OpenRouter | `OPENROUTER_API_KEY` | `openrouter` |
| Vercel AI Gateway | `AI_GATEWAY_API_KEY` | `vercel-ai-gateway` |
| ZAI | `ZAI_API_KEY` | `zai` |
| OpenCode Zen | `OPENCODE_API_KEY` | `opencode` |
| OpenCode Go | `OPENCODE_API_KEY` | `opencode-go` |
| Hugging Face | `HF_TOKEN` | `huggingface` |
| Fireworks | `FIREWORKS_API_KEY` | `fireworks` |
| Kimi For Coding | `KIMI_API_KEY` | `kimi-coding` |
| MiniMax | `MINIMAX_API_KEY` | `minimax` |
| MiniMax (China) | `MINIMAX_CN_API_KEY` | `minimax-cn` |
| Xiaomi MiMo | `XIAOMI_API_KEY` | `xiaomi` |
| Xiaomi MiMo Token Plan (China) | `XIAOMI_TOKEN_PLAN_CN_API_KEY` | `xiaomi-token-plan-cn` |
| Xiaomi MiMo Token Plan (Amsterdam) | `XIAOMI_TOKEN_PLAN_AMS_API_KEY` | `xiaomi-token-plan-ams` |
| Xiaomi MiMo Token Plan (Singapore) | `XIAOMI_TOKEN_PLAN_SGP_API_KEY` | `xiaomi-token-plan-sgp` |

Reference for environment variables and `auth.json` keys: `const envMap` in `packages/ai/src/env-api-keys.ts`.

### Auth File

Store credentials in `~/.pi/agent/auth.json`:

```json
{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "deepseek": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." },
  "opencode": { "type": "api_key", "key": "..." },
  "opencode-go": { "type": "api_key", "key": "..." },
  "xiaomi": { "type": "api_key", "key": "..." },
  "xiaomi-token-plan-cn": { "type": "api_key", "key": "..." },
  "xiaomi-token-plan-ams": { "type": "api_key", "key": "..." },
  "xiaomi-token-plan-sgp": { "type": "api_key", "key": "..." }
}
```

The file is created with 0600 permissions (user read/write only). Auth file credentials take priority over environment variables.
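To confirm the permissions on your own machine, a quick check works (GNU and BSD/macOS `stat` take different flags, hence the fallback):

```bash
# Print the octal mode of auth.json; 600 means owner read/write only.
auth_file="$HOME/.pi/agent/auth.json"
if [ -f "$auth_file" ]; then
  stat -c '%a %n' "$auth_file" 2>/dev/null || stat -f '%Lp %N' "$auth_file"
fi
```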

### Key Resolution

The `key` field supports three formats:

- Shell command: `"!command"` executes and uses stdout (cached for the process lifetime)

  ```json
  { "type": "api_key", "key": "!security find-generic-password -ws 'anthropic'" }
  { "type": "api_key", "key": "!op read 'op://vault/item/credential'" }
  ```

- Environment variable: uses the value of the named variable

  ```json
  { "type": "api_key", "key": "MY_ANTHROPIC_KEY" }
  ```

- Literal value: used directly

  ```json
  { "type": "api_key", "key": "sk-ant-..." }
  ```

OAuth credentials are also stored here after `/login` and managed automatically.
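The three key formats can be pictured as the following lookup logic. This is a simplified sketch, not pi's actual implementation: `resolve_key` is a hypothetical name, and the real resolver also caches shell-command output for the process lifetime.

```bash
# Hypothetical sketch of the key-resolution rules described above.
resolve_key() {
  key="$1"
  case "$key" in
    '!'*)
      # "!command": run everything after the '!' as a shell command
      # and use its stdout as the key.
      sh -c "${key#?}"
      ;;
    *)
      # If the value names a set environment variable, use that
      # variable's value; otherwise treat it as a literal API key.
      if printenv "$key" >/dev/null 2>&1; then
        printenv "$key"
      else
        printf '%s\n' "$key"
      fi
      ;;
  esac
}
```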

## Cloud Providers

### Azure OpenAI

```bash
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# also supported: https://your-resource.cognitiveservices.azure.com
# root endpoints are auto-normalized to /openai/v1
# or use resource name instead of base URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource

# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
```
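The `AZURE_OPENAI_DEPLOYMENT_NAME_MAP` value is a comma-separated list of `model=deployment` pairs. To illustrate the format only (this is not pi's parsing code, and `lookup_deployment` is a made-up helper), a lookup could look like:

```bash
# Split the comma-separated model=deployment pairs and print the
# deployment name mapped to the given model ID.
lookup_deployment() {
  printf '%s\n' "$1" | tr ',' '\n' | awk -F= -v model="$2" '$1 == model { print $2 }'
}

lookup_deployment "gpt-4=my-gpt4,gpt-4o=my-gpt4o" gpt-4o   # prints my-gpt4o
```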

### Amazon Bedrock

```bash
# Option 1: AWS Profile
export AWS_PROFILE=your-profile

# Option 2: IAM Keys
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...

# Option 3: Bearer Token
export AWS_BEARER_TOKEN_BEDROCK=...

# Optional region (defaults to us-east-1)
export AWS_REGION=us-west-2
```

Also supports ECS task roles (`AWS_CONTAINER_CREDENTIALS_*`) and IRSA (`AWS_WEB_IDENTITY_TOKEN_FILE`).

```bash
pi --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0
```

Prompt caching is enabled automatically for Claude models whose ID contains a recognizable model name (base models and system-defined inference profiles). For application inference profiles (whose ARNs don't contain the model name), set `AWS_BEDROCK_FORCE_CACHE=1` to enable cache points:

```bash
export AWS_BEDROCK_FORCE_CACHE=1
pi --provider amazon-bedrock --model arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123
```

If you are connecting to a Bedrock API proxy, the following environment variables can be used:

```bash
# Set the URL for the Bedrock proxy (standard AWS SDK env var)
export AWS_ENDPOINT_URL_BEDROCK_RUNTIME=https://my.corp.proxy/bedrock

# Set if your proxy does not require authentication
export AWS_BEDROCK_SKIP_AUTH=1

# Set if your proxy only supports HTTP/1.1
export AWS_BEDROCK_FORCE_HTTP1=1
```

### Cloudflare AI Gateway

`CLOUDFLARE_API_KEY` can be set via `/login`. The account ID and gateway slug must be set as environment variables.

```bash
export CLOUDFLARE_API_KEY=...           # or use /login
export CLOUDFLARE_ACCOUNT_ID=...
export CLOUDFLARE_GATEWAY_ID=...        # create at dash.cloudflare.com → AI → AI Gateway
pi --provider cloudflare-ai-gateway --model "claude-sonnet-4-5"
```

Routes to OpenAI, Anthropic, and Workers AI through Cloudflare AI Gateway. Workers AI uses the Unified API (`/compat`) and prefixed model IDs (`workers-ai/@cf/...`). OpenAI uses the OpenAI passthrough route (`/openai`) with native OpenAI model IDs such as `gpt-5.1`. Anthropic uses the Anthropic passthrough route (`/anthropic`) with native Anthropic model IDs such as `claude-sonnet-4-5`.

AI Gateway authentication uses `CLOUDFLARE_API_KEY` as `cf-aig-authorization`. Upstream authentication can be one of:

| Mode | Request auth | Upstream auth |
| --- | --- | --- |
| Workers AI | Cloudflare token only | Cloudflare-native |
| Unified billing | Cloudflare token only | Cloudflare handles upstream auth and deducts credits |
| Stored BYOK | Cloudflare token only | Cloudflare injects provider keys stored in the AI Gateway dashboard |
| Inline BYOK | Cloudflare token plus upstream `Authorization` header | The request supplies the upstream provider key |

For normal pi usage, prefer unified billing or stored BYOK. Inline BYOK requires configuring an additional upstream `Authorization` header for the Cloudflare AI Gateway provider, for example via a `models.json` provider/model override.

### Cloudflare Workers AI

`CLOUDFLARE_API_KEY` can be set via `/login`. `CLOUDFLARE_ACCOUNT_ID` must be set as an environment variable.

```bash
export CLOUDFLARE_API_KEY=...           # or use /login
export CLOUDFLARE_ACCOUNT_ID=...
pi --provider cloudflare-workers-ai --model "@cf/moonshotai/kimi-k2.6"
```

Pi automatically sets `x-session-affinity` for prefix-caching discounts.

### Google Vertex AI

Uses Application Default Credentials:

```bash
gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1
```

Or set `GOOGLE_APPLICATION_CREDENTIALS` to a service account key file.

## Custom Providers

Via `models.json`: add Ollama, LM Studio, vLLM, or any provider that speaks a supported API (OpenAI Completions, OpenAI Responses, Anthropic Messages, Google Generative AI). See models.md.

Via extensions: for providers that need custom API implementations or OAuth flows, create an extension. See custom-provider.md and `examples/extensions/custom-provider-gitlab-duo`.

## Resolution Order

When resolving credentials for a provider:

1. CLI `--api-key` flag
2. `auth.json` entry (API key or OAuth token)
3. Environment variable
4. Custom provider keys from `models.json`
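The precedence amounts to a first-non-empty scan over the four sources. As an illustrative sketch only (`first_credential` is a hypothetical name, not part of pi):

```bash
# Return the first non-empty argument, mirroring the credential
# precedence: --api-key flag, auth.json, environment, models.json.
first_credential() {
  for source in "$@"; do
    if [ -n "$source" ]; then
      printf '%s\n' "$source"
      return 0
    fi
  done
  return 1
}

# No CLI flag set, but an auth.json entry exists: auth.json wins over env.
first_credential "" "key-from-auth-json" "key-from-env" ""   # prints key-from-auth-json
```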