docs/providers/kilocode.md
Kilo Gateway provides a unified API that routes requests to many models behind a single endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switching the base URL.
| Property | Value |
|---|---|
| Provider | kilocode |
| Auth | KILOCODE_API_KEY |
| API | OpenAI-compatible |
| Base URL | https://api.kilo.ai/api/gateway/ |
Set the API key as an environment variable:
```bash
export KILOCODE_API_KEY="<your-kilocode-api-key>" # pragma: allowlist secret
```
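Because the gateway speaks the OpenAI chat-completions protocol, a plain HTTP client is enough. The sketch below builds (but does not send) a request using only the Python standard library; the exact model-name form the gateway expects and the `chat/completions` path are assumptions based on standard OpenAI-compatible APIs:

```python
import json
import os

# Documented Kilo Gateway base URL; "chat/completions" is the standard
# OpenAI-compatible path (an assumption for this gateway).
BASE_URL = "https://api.kilo.ai/api/gateway/"
url = BASE_URL + "chat/completions"

api_key = os.environ.get("KILOCODE_API_KEY", "<your-kilocode-api-key>")

# The gateway authenticates with a Bearer token carrying your API key.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

# Standard OpenAI chat-completions payload; the model name shown here
# (gateway-side path without the kilocode/ prefix) is illustrative.
payload = {
    "model": "kilo/auto",
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload).encode("utf-8")

# To actually send, pass url, body, and headers to e.g.
# urllib.request.Request(url, data=body, headers=headers, method="POST").
```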
The default model is `kilocode/kilo/auto`, a provider-owned smart-routing model managed by Kilo Gateway.

OpenClaw dynamically discovers available models from the Kilo Gateway at startup. Run `/models kilocode` to see the full list of models available with your account.
Any model available on the gateway can be used with the `kilocode/` prefix:
| Model ref | Notes |
|---|---|
| `kilocode/kilo/auto` | Default (smart routing) |
| `kilocode/anthropic/claude-sonnet-4` | Anthropic via Kilo |
| `kilocode/openai/gpt-5.5` | OpenAI via Kilo |
| ...and many more | Run `/models kilocode` to list all |
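A `kilocode/...` ref is the provider prefix plus the model path forwarded to the gateway. A minimal sketch of how such a ref could be split (the helper name is hypothetical, not an OpenClaw API):

```python
def split_model_ref(ref: str) -> tuple[str, str]:
    """Split a ref like 'kilocode/anthropic/claude-sonnet-4' into the
    provider prefix and the gateway-side model path."""
    provider, _, model = ref.partition("/")
    return provider, model

print(split_model_ref("kilocode/anthropic/claude-sonnet-4"))
# -> ('kilocode', 'anthropic/claude-sonnet-4')
```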
Example configuration:

```json5
{
  env: { KILOCODE_API_KEY: "<your-kilocode-api-key>" }, // pragma: allowlist secret
  agents: {
    defaults: {
      model: { primary: "kilocode/kilo/auto" },
    },
  },
}
```
- Gemini-backed Kilo refs stay on the proxy-Gemini path, so OpenClaw keeps
Gemini thought-signature sanitation there without enabling native Gemini
replay validation or bootstrap rewrites.
- Kilo Gateway authenticates requests with a Bearer token carrying your API key under the hood.
<Warning>
`kilocode/kilo/auto` and other refs hinted as not supporting proxy reasoning skip reasoning injection. If you need reasoning support, use a concrete model ref such as `kilocode/anthropic/claude-sonnet-4`.
</Warning>