This guide explains how to integrate Opik with OpenRouter using the OpenAI SDK. OpenRouter provides a unified API for accessing hundreds of AI models through a single OpenAI-compatible interface.
First, ensure you have both the `opik` and `openai` packages installed:

```bash
pip install opik openai
```
You'll also need an OpenRouter API key, which you can get from your OpenRouter account.
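To avoid hard-coding the key in your source, you can export it as an environment variable and read it at runtime. A minimal sketch (the variable name `OPENROUTER_API_KEY` is a common convention, not something the SDK requires):

```python
import os

# Assumes you have run: export OPENROUTER_API_KEY="sk-or-..."
# Falls back to an empty string if the variable is not set.
api_key = os.environ.get("OPENROUTER_API_KEY", "")
```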
```python
from opik.integrations.openai import track_openai
from openai import OpenAI

# Initialize the OpenAI client with the OpenRouter base URL
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)
client = track_openai(client)

# Optional headers for the OpenRouter leaderboard
headers = {
    "HTTP-Referer": "YOUR_SITE_URL",  # Optional. Site URL for rankings
    "X-Title": "YOUR_SITE_NAME",      # Optional. Site title for rankings
}

response = client.chat.completions.create(
    model="openai/gpt-4",  # You can use any model available on OpenRouter
    extra_headers=headers,
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ],
    temperature=0.7,
    max_tokens=100,
)
print(response.choices[0].message.content)
```
OpenRouter provides access to a wide variety of models, including many open source models from different providers.
You can find the complete list of available models in the OpenRouter documentation.
OpenRouter supports the following methods:

- `client.chat.completions.create()`: works with all models
- `client.beta.chat.completions.parse()`: only compatible with OpenAI models

For detailed information about available methods, parameters, and best practices, refer to the OpenRouter API documentation.