apps/opik-documentation/documentation/templates/integration_template_openai.md
Use this template for: Integrations that use OpenAI-compatible APIs (like BytePlus, OpenRouter, etc.) and can be integrated using Opik's OpenAI integration.
Requirements:
- `track_openai()` wrapper

Examples: BytePlus, OpenRouter, any OpenAI-compatible API
[INTEGRATION_NAME] is [INTEGRATION_DESCRIPTION].
This guide explains how to integrate Opik with [INTEGRATION_NAME] using the OpenAI SDK. [INTEGRATION_NAME] provides [SPECIFIC_DESCRIPTION].
First, ensure you have both opik and openai packages installed:
```bash
pip install opik openai
```
You'll also need a [INTEGRATION_NAME] API key which you can get from [INTEGRATION_WEBSITE_URL].
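If you haven't set up Opik yet, you can configure your Opik API key and workspace from the terminal (a sketch; the `opik configure` command prompts you interactively):

```shell
# Configure Opik credentials (interactive prompt for API key and workspace)
opik configure
```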
```python
from opik.integrations.openai import track_openai
from openai import OpenAI

# Initialize the OpenAI client with the [INTEGRATION_NAME] base URL
client = OpenAI(
    base_url="[INTEGRATION_BASE_URL]",
    api_key="YOUR_[INTEGRATION_API_KEY_NAME]",
)

# Wrap the client so every call is traced in Opik
client = track_openai(client)

response = client.chat.completions.create(
    model="[EXAMPLE_MODEL_NAME]",  # You can use any model available on [INTEGRATION_NAME]
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ],
    temperature=0.7,
    max_tokens=100,
)

print(response.choices[0].message.content)
```
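Every OpenAI-compatible chat completion response also carries a `usage` object. A small helper like the following (a sketch, not part of the Opik API) can surface token counts alongside your traces:

```python
def summarize_usage(response) -> str:
    """Format token counts from a chat completion's `usage` field."""
    usage = response.usage
    return (
        f"prompt={usage.prompt_tokens} "
        f"completion={usage.completion_tokens} "
        f"total={usage.total_tokens}"
    )

# Example: print(summarize_usage(response)) after the call above
```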
[INTEGRATION_NAME] provides access to [MODEL_DESCRIPTION].
You can find the complete list of available models in the [INTEGRATION_NAME] documentation.
[INTEGRATION_NAME] supports the following methods:

- `client.chat.completions.create()`: works with all models
- `client.beta.chat.completions.parse()`: only compatible with OpenAI models

For detailed information about available methods, parameters, and best practices, refer to the [INTEGRATION_NAME] API documentation.
You can combine the tracked client with Opik's @track decorator for comprehensive tracing:
```python
from opik import track
from opik.integrations.openai import track_openai
from openai import OpenAI

client = OpenAI(
    base_url="[INTEGRATION_BASE_URL]",
    api_key="YOUR_[INTEGRATION_API_KEY_NAME]",
)
client = track_openai(client)

@track
def analyze_data_with_ai(query: str):
    """Analyze data using [INTEGRATION_NAME] AI models."""
    response = client.chat.completions.create(
        model="[EXAMPLE_MODEL_NAME]",
        messages=[
            {"role": "user", "content": query}
        ],
    )
    return response.choices[0].message.content

# Call the tracked function
result = analyze_data_with_ai("Analyze this business data...")
```
[INTEGRATION_NAME] supports streaming responses:
```python
response = client.chat.completions.create(
    model="[EXAMPLE_MODEL_NAME]",
    messages=[
        {"role": "user", "content": "Tell me a story about AI"}
    ],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
```
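If you need the full response text in addition to the live printout, a small accumulator works with any OpenAI-style stream of chunks (a sketch, independent of the provider):

```python
def collect_stream(chunks) -> str:
    """Join streamed delta fragments into the complete response text."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta is not None:  # the final chunk's delta is typically None
            parts.append(delta)
    return "".join(parts)

# Example: full_text = collect_stream(response)
```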
Make sure to set the following environment variables:
```bash
# [INTEGRATION_NAME] Configuration
export [INTEGRATION_API_KEY_NAME]="your-[integration-name]-api-key"

# Opik Configuration
export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"
```
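In Python you can fail fast on a missing key rather than hitting an opaque authentication error later. A minimal sketch, where `"INTEGRATION_API_KEY_NAME"` stands in for the real variable name from the exports above:

```python
import os

def load_config() -> dict:
    """Read the integration and Opik settings from the environment."""
    # "INTEGRATION_API_KEY_NAME" is the template placeholder, not a real variable
    api_key = os.getenv("INTEGRATION_API_KEY_NAME")
    if not api_key:
        raise RuntimeError("INTEGRATION_API_KEY_NAME is not set")
    return {
        "api_key": api_key,
        "project": os.getenv("OPIK_PROJECT_NAME", "Default Project"),
        "workspace": os.getenv("OPIK_WORKSPACE"),
    }
```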
Once your [INTEGRATION_NAME] calls are logged with Opik, you can view them in the Opik UI. Each API call creates a trace with detailed information, including the input messages, model parameters, response content, and token usage.
Once you have [INTEGRATION_NAME] integrated with Opik, you can explore the logged traces in the Opik UI, run evaluations over them, and monitor your application in production.
For more information about using Opik with OpenAI-compatible APIs, see the OpenAI integration guide.