Using OpenRouter With Kilo Code

OpenRouter is an AI platform that provides access to a wide variety of language models from different providers, all through a single API. This can simplify setup and allow you to easily experiment with different models.

Website: https://openrouter.ai/

Getting an API Key

  1. Sign Up/Sign In: Go to the OpenRouter website and sign in with your Google or GitHub account.
  2. Get an API Key: Go to the keys page. You should see an API key listed. If not, create a new key.
  3. Copy the Key: Copy the API key.
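Before configuring Kilo Code, you can sanity-check the key with a direct request to OpenRouter's chat completions endpoint. This requires a live key, and the model ID shown is just an example — substitute any model available on your account:

```bash
# Send a minimal one-message request; a valid key returns a JSON
# completion, while an invalid key returns a 401 error object.
curl -s https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```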

Configuration in Kilo Code

{% tabs %} {% tab label="VSCode" %}

Open Settings (gear icon) and go to the Providers tab to add OpenRouter and enter your API key.

The extension stores this in your kilo.json config file. You can also edit the config file directly — see the CLI tab for the file format.

{% /tab %} {% tab label="CLI" %}

Set the API key as an environment variable or configure it in your kilo.json config file:

Environment variable:

```bash
export OPENROUTER_API_KEY="your-api-key"
```

Config file (~/.config/kilo/kilo.json or ./kilo.json):

```jsonc
{
  "provider": {
    "openrouter": {
      "env": ["OPENROUTER_API_KEY"],
    },
  },
}
```

Then set your default model:

```jsonc
{
  "model": "openrouter/anthropic/claude-sonnet-4-20250514",
}
```
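Putting both pieces together, a minimal complete kilo.json might look like the following sketch — adjust the model ID to whichever model you actually use:

```jsonc
{
  "provider": {
    "openrouter": {
      // Read the API key from the environment rather than storing it here.
      "env": ["OPENROUTER_API_KEY"],
    },
  },
  "model": "openrouter/anthropic/claude-sonnet-4-20250514",
}
```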

{% /tab %} {% tab label="VSCode (Legacy)" %}

  1. Open Kilo Code Settings: Click the gear icon ({% codicon name="gear" /%}) in the Kilo Code panel.
  2. Select Provider: Choose "OpenRouter" from the "API Provider" dropdown.
  3. Enter API Key: Paste your OpenRouter API key into the "OpenRouter API Key" field.
  4. Select Model: Choose your desired model from the "Model" dropdown.
  5. (Optional) Custom Base URL: If you need to use a custom base URL for the OpenRouter API, check "Use custom base URL" and enter the URL. Most users can leave this unchecked.

{% /tab %} {% /tabs %}

Supported Transforms

OpenRouter provides an optional "middle-out" message transform to help with prompts that exceed the maximum context size of a model.

{% tabs %} {% tab label="VSCode & CLI" %}

The middle-out transform is not exposed as a dedicated UI control in the VS Code extension. To enable it, set transforms on the model's options in your kilo.json config file. Anything under options is forwarded verbatim to the OpenRouter AI SDK as providerOptions.openrouter:

```jsonc
{
  "provider": {
    "openrouter": {
      "models": {
        "anthropic/claude-sonnet-4-20250514": {
          "options": {
            "transforms": ["middle-out"],
          },
        },
      },
    },
  },
}
```

{% /tab %} {% tab label="VSCode (Legacy)" %}

The legacy extension does not expose a control for the middle-out transform, and it does not read configuration from kilo.json. To use this option, switch to the new VS Code extension or the CLI.

{% /tab %} {% /tabs %}

Provider Routing

OpenRouter can route to many different inference providers. This can be controlled directly via OpenRouter's provider routing parameter.

{% tabs %} {% tab label="VSCode & CLI" %}

Provider routing is not exposed as dedicated UI controls in the VS Code extension. To configure it, set OpenRouter's provider routing fields under the model's options in your kilo.json config file. Everything under options is forwarded to the OpenRouter AI SDK as providerOptions.openrouter, so any field from the OpenRouter provider routing docs can be used.

```jsonc
{
  "provider": {
    "openrouter": {
      "models": {
        "anthropic/claude-sonnet-4-20250514": {
          "options": {
            "provider": {
              "sort": "price", // "price" | "throughput" | "latency"
              "order": ["Anthropic", "Google"], // specific provider preference
              "only": ["Anthropic"], // restrict to listed providers
              "data_collection": "deny", // "allow" | "deny"
              "zdr": true, // zero data retention
            },
          },
        },
      },
    },
  },
}
```

Omit any field to fall back to your OpenRouter account's default. Fields are passed through without validation — see OpenRouter's docs for the full list of supported values.
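For example, to require zero data retention for this model while leaving sorting and provider order at your account defaults, set only that one field (illustrative; the same pass-through caveat applies):

```jsonc
{
  "provider": {
    "openrouter": {
      "models": {
        "anthropic/claude-sonnet-4-20250514": {
          "options": {
            "provider": {
              // Only providers with a zero-data-retention policy are used;
              // all other routing behavior follows your account settings.
              "zdr": true,
            },
          },
        },
      },
    },
  },
}
```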

{% /tab %} {% tab label="VSCode (Legacy)" %}

The legacy settings UI exposes provider routing as two dropdowns under Provider Routing:

Provider Sorting

  • Default provider sorting: use the setting in your OpenRouter account
  • Prefer providers with lower price
  • Prefer providers with higher throughput (i.e. more tokens per second)
  • Prefer providers with lower latency (i.e. shorter time to first token)
  • A specific provider preference can also be chosen.

Data Policy

  • No data policy set: use the settings in your OpenRouter account.
  • Allow prompt training: providers that may train on your prompts or completions are allowed. Free models generally require this option to be enabled.
  • Deny prompt training: providers that may train on your prompts or completions are not allowed.
  • Zero data retention: only providers with a strict zero data retention policy are allowed. This option is not recommended, as it will disable many popular providers, such as Anthropic and OpenAI.

{% /tab %} {% /tabs %}

Tips and Notes

  • Model Selection: OpenRouter offers a wide range of models. Experiment to find the best one for your needs.
  • Pricing: OpenRouter charges based on the underlying model's pricing. See the OpenRouter Models page for details.
  • Prompt Caching: Some providers support prompt caching. See the OpenRouter documentation for supported models.