packages/kilo-docs/pages/ai-providers/vercel-ai-gateway.md
The AI Gateway provides a unified API to access hundreds of models through a single endpoint. It lets you set budgets, monitor usage, load-balance requests, and manage fallbacks.
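Because the gateway exposes a single OpenAI-compatible endpoint, any HTTP client can reach every model through the same URL. A minimal sketch with curl (assuming your key is in `AI_GATEWAY_API_KEY`; the exact route and request shape follow the OpenAI-style convention, so treat this as illustrative):

```sh
# Illustrative request against the gateway's OpenAI-compatible
# chat completions route; requires a valid AI_GATEWAY_API_KEY.
curl https://ai-gateway.vercel.sh/v1/chat/completions \
  -H "Authorization: Bearer $AI_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```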
An API key is required for authentication.
The Vercel AI Gateway supports a large and growing number of models. Kilo Code automatically fetches the list of available models from the `https://ai-gateway.vercel.sh/v1/models` endpoint. Only language models are shown.
The default model is `anthropic/claude-sonnet-4` if no model is selected.
Refer to the Vercel AI Gateway Models page for the complete and up-to-date list.
Check the model description in the dropdown for specific capabilities.
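To see which models the extension will pick up, you can query the same endpoint yourself. A quick sketch (the listing may require your API key, and the `jq` filter assumes the response uses the OpenAI-style `data[].id` shape):

```sh
# Fetch the model catalog Kilo Code reads and print each model ID.
# Assumes AI_GATEWAY_API_KEY is set and jq is installed.
curl -s https://ai-gateway.vercel.sh/v1/models \
  -H "Authorization: Bearer $AI_GATEWAY_API_KEY" | jq -r '.data[].id'
```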
{% tabs %} {% tab label="VSCode (Legacy)" %}
{% /tab %} {% tab label="VSCode" %}
Open Settings (gear icon) and go to the Providers tab to add Vercel AI Gateway and enter your API key.
The extension stores this in your kilo.json config file. You can also edit the config file directly — see the CLI tab for the file format.
{% /tab %} {% tab label="CLI" %}
Set the API key as an environment variable or configure it in your kilo.json config file:
Environment variable:

```sh
export AI_GATEWAY_API_KEY="your-api-key"
```
Config file (`~/.config/kilo/kilo.json` or `./kilo.json`):

```json
{
  "provider": {
    "vercel": {
      "env": ["AI_GATEWAY_API_KEY"]
    }
  }
}
```
Then set your default model:
```json
{
  "model": "vercel/anthropic/claude-sonnet-4"
}
```
{% /tab %} {% /tabs %}
Vercel AI Gateway supports automatic prompt caching for select models including Anthropic Claude and OpenAI GPT models. This reduces costs by caching frequently used prompts.
The default temperature is 0.7 and is configurable per model.