# Provider Configuration

docs/book/src/providers/configuration.md

Every model provider is declared under [providers.models.<name>] in ~/.zeroclaw/config.toml. The <name> is your own alias — it's how you reference the provider elsewhere in config (default_model = "claude", fallback_providers = ["claude", "local"], etc.).
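
For instance, two aliases can be declared once and then referenced by name from the top-level defaults. This is a sketch that assumes `default_model` and `fallback_providers` sit at the top level of the file, as the intro above suggests; the model ids are the ones used elsewhere on this page:

```toml
# Top-level defaults refer to provider aliases by name,
# so they must appear before the first [providers.models.*] table.
default_model = "claude"
fallback_providers = ["claude", "local"]

[providers.models.claude]
kind = "anthropic"
model = "claude-haiku-4-5-20251001"

[providers.models.local]
kind = "ollama"
base_url = "http://localhost:11434"
model = "qwen3.6:35b-a3b"
```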

## Minimum shape

```toml
[providers.models.<name>]
kind = "<provider-kind>"   # required — selects the implementation
model = "<model-id>"       # required — passed to the provider
```

Almost every provider also takes:

```toml
api_key = "..."            # or an env-var placeholder
base_url = "https://..."   # for OpenAI-compatible or self-hosted endpoints
```

## Kinds

| kind | Implementation | Notes |
|------|----------------|-------|
| anthropic | crates/zeroclaw-providers/src/anthropic.rs | Accepts OAuth tokens (sk-ant-oat*) or API keys |
| openai | openai.rs | GPT, o-series |
| ollama | ollama.rs | Native /api/chat. Supports structured output via format |
| openai-compatible | compatible.rs | One impl for ~20 providers; set base_url and optionally api_key |
| bedrock | bedrock.rs | Uses AWS credentials chain (env, IAM role, profile) |
| gemini | gemini.rs | |
| gemini-cli | gemini_cli.rs | Shells out to gemini CLI; no API key needed |
| azure-openai | azure_openai.rs | Takes base_url + api_version + deployment |
| copilot | copilot.rs | OAuth flow built in |
| openrouter | openrouter.rs | Multi-vendor routing layer |
| claude-code | claude_code.rs | Delegates to a Claude Code session via MCP |
| telnyx | telnyx.rs | Voice AI via Telnyx |
| kilocli | kilocli.rs | Local KiloCLI inference |
| reliable | reliable.rs | Fallback-chain wrapper — see Fallback & routing |
| router | router.rs | Task-hint router — see Fallback & routing |
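
Some kinds in the table delegate auth entirely to an existing local login. As a sketch, a gemini-cli entry needs no api_key at all, only the kind and a model; the alias and model id below are illustrative, not taken from the docs:

```toml
# gemini-cli shells out to the local gemini CLI and reuses its auth,
# so no api_key or base_url is configured here.
[providers.models.flash]
kind = "gemini-cli"
model = "gemini-2.5-flash"
```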

## Credentials

Four ways to supply credentials, in resolution order:

  1. Inline api_key = "..." in the config entry (fine for dev, risky for checked-in configs)
  2. Config-level secrets store — encrypted at ~/.zeroclaw/secrets via a local key file
  3. Provider-specific env var — ANTHROPIC_API_KEY, ANTHROPIC_OAUTH_TOKEN, OPENAI_API_KEY, OPENROUTER_API_KEY, GROQ_API_KEY, etc.
  4. Generic fallback — ZEROCLAW_API_KEY, then API_KEY
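
The precedence can be sketched as a short lookup chain. This is illustrative only: resolve_api_key, load_secret, and the entry dict are hypothetical stand-ins for Zeroclaw's internal (Rust) logic, not its actual API.

```python
import os


def load_secret(alias):
    """Hypothetical stand-in for the encrypted ~/.zeroclaw/secrets store."""
    return None


def resolve_api_key(entry: dict, provider_env: str):
    """Mirror the four resolution steps above, highest priority first."""
    # 1. Inline api_key in the [providers.models.<name>] entry
    if entry.get("api_key"):
        return entry["api_key"]
    # 2. Config-level encrypted secrets store
    secret = load_secret(entry.get("name"))
    if secret:
        return secret
    # 3. Provider-specific env var (e.g. ANTHROPIC_API_KEY)
    if os.environ.get(provider_env):
        return os.environ[provider_env]
    # 4. Generic fallbacks
    return os.environ.get("ZEROCLAW_API_KEY") or os.environ.get("API_KEY")
```

A config you commit should rely on steps 2-4, never step 1.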

The onboarding wizard writes credentials to the secrets store by default. Config files you commit should use neither inline keys nor env_passthrough entries that leak user keys.

## OAuth and subscription auth

Several providers support OAuth / subscription-style tokens instead of raw API keys:

  • Anthropic — sk-ant-oat-* OAuth tokens work anywhere an API key does. No cost if you're on a Pro/Team plan.
  • GitHub Copilot — authenticate via the onboarding wizard (zeroclaw onboard) which handles the OAuth flow. The token is stored in the secrets backend.
  • Gemini CLI — uses the gemini CLI's existing auth.
  • Claude Code — uses your Claude Code login.

## Container-friendly overrides

The onboarding wizard detects Docker/Podman/Kubernetes and rewrites localhost to container-appropriate hostnames:

```toml
[providers.models.local]
kind = "ollama"
base_url = "http://host.docker.internal:11434"   # was "http://localhost:11434" on host
```

You can also force this manually at runtime:

```bash
ZEROCLAW_OLLAMA_BASE_URL=http://ollama:11434 zeroclaw agent
```

## Per-provider knobs

Beyond the universal fields, some providers accept extras. Highlights:

### Ollama

```toml
[providers.models.local]
kind = "ollama"
base_url = "http://localhost:11434"
model = "qwen3.6:35b-a3b"
think = false                # disable reasoning mode for faster output
reasoning_effort = "none"    # same intent, passed as a top-level field
options = { temperature = 0, num_ctx = 32768 }
```

### OpenAI-compatible

```toml
[providers.models.groq]
kind = "openai-compatible"
base_url = "https://api.groq.com/openai"
model = "llama-3.3-70b-versatile"
api_key = "gsk_..."
# Optional — stream tool calls natively over SSE when the endpoint supports it
native_tool_streaming = true
```

### Azure OpenAI

```toml
[providers.models.azure]
kind = "azure-openai"
base_url = "https://my-resource.openai.azure.com"
deployment = "gpt-4o"
api_version = "2024-10-01-preview"
api_key = "..."
```

## Picking the default

```toml
default_provider = "claude"
default_model    = "claude-haiku-4-5-20251001"
```

Both are read at agent startup. Channels, tools, and SOPs can override per-request.

## See also