apps/opik-documentation/documentation/fern/docs-v2/integrations/fireworks-ai.mdx
Fireworks AI provides fast inference for popular open-source models, offering high-performance API access to models like Llama, Mistral, and Qwen with optimized inference infrastructure.
This guide explains how to integrate Opik with Fireworks AI using their OpenAI-compatible API endpoints. With the Opik OpenAI integration, you can easily track and evaluate your Fireworks AI API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.
First, you'll need a Comet.com account to use Opik. If you don't have one, you can sign up for free.
Install the required packages:
```bash
pip install opik openai
```
Configure Opik to send traces to your Comet project:
```python
import opik

opik.configure(
    project_name="your-project-name",
    workspace="your-workspace-name",
)
```
Set your Fireworks AI API key as an environment variable:
```bash
export FIREWORKS_API_KEY="your-fireworks-api-key"
```
You can obtain a Fireworks AI API key from the Fireworks AI dashboard.
```python
import os

from openai import OpenAI
from opik.integrations.openai import track_openai

# Create an OpenAI client pointed at the Fireworks AI endpoint,
# reading the API key from the environment variable set above
client = OpenAI(
    api_key=os.environ["FIREWORKS_API_KEY"],
    base_url="https://api.fireworks.ai/inference/v1",
)

# Wrap the client with Opik tracking
tracked_client = track_openai(client, project_name="your-project-name")

# Make a chat completion call
response = tracked_client.chat.completions.create(
    model="accounts/fireworks/models/llama-v3p1-8b-instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are the benefits of using fast inference platforms?"},
    ],
)

print(response.choices[0].message.content)
```
You can combine the tracked client with Opik's `@track` decorator for more comprehensive tracing:
```python
from opik import track

@track
def analyze_text_with_fireworks(text: str):
    response = tracked_client.chat.completions.create(
        model="accounts/fireworks/models/llama-v3p1-8b-instruct",
        messages=[
            {"role": "user", "content": f"Analyze this text: {text}"}
        ],
    )
    return response.choices[0].message.content

# Use the function
result = analyze_text_with_fireworks("Open source AI models are becoming increasingly powerful.")
```
Fireworks AI supports streaming responses, which are also tracked by Opik:
```python
response = tracked_client.chat.completions.create(
    model="accounts/fireworks/models/llama-v3p1-8b-instruct",
    messages=[
        {"role": "user", "content": "Explain quantum computing in simple terms."}
    ],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
```
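If you also need the complete response text once the stream finishes, you can accumulate the deltas instead of only printing them. A minimal sketch of such a helper (the `collect_stream` name is illustrative, not part of Opik or the OpenAI SDK):

```python
def collect_stream(chunks):
    """Accumulate the text deltas of a streamed chat completion.

    `chunks` is the iterator returned by a
    chat.completions.create(..., stream=True) call.
    """
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta is not None:  # some chunks carry no text delta
            parts.append(delta)
    return "".join(parts)
```

You would call it as `full_text = collect_stream(response)` in place of the `for` loop above.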
Fireworks AI provides fast inference for a wide range of popular open-source models. Some of the popular models available include:

- `accounts/fireworks/models/llama-v3p1-8b-instruct`
- `accounts/fireworks/models/llama-v3-70b-instruct`
- `accounts/fireworks/models/mixtral-8x7b-instruct-hf`
- `accounts/fireworks/models/mistral-7b-instruct-v0.1`
- `accounts/fireworks/models/qwen-72b-chat`
- `accounts/fireworks/models/qwen-14b-chat`
- `accounts/fireworks/models/starcoder-16b`
- `accounts/fireworks/models/codellama-34b-instruct-hf`

For the complete list of available models, visit the Fireworks AI model catalog.
Once your Fireworks AI calls are logged with Opik, you can view detailed traces in your Opik dashboard. Each API call creates a trace that includes the input prompt, the model used, token usage, and the generated response.
If you encounter issues with the integration:

- **API Key Issues**: Ensure your `FIREWORKS_API_KEY` environment variable is set correctly and your account has sufficient credits.
- **Model Name Format**: Fireworks AI models use the format `accounts/fireworks/models/{model-name}`. Make sure you're using the correct model identifier.
- **Rate Limiting**: Fireworks AI enforces rate limits based on your plan. If you encounter rate limiting, consider implementing exponential backoff in your application.
- **Base URL**: The base URL for Fireworks AI is `https://api.fireworks.ai/inference/v1`. Ensure you're using the correct endpoint.
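For the rate-limiting case, a minimal retry-with-exponential-backoff sketch might look like the following. The `with_backoff` helper and its defaults are illustrative, not part of Opik or the OpenAI SDK:

```python
import random
import time

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` with jittered exponential backoff.

    `call` is any zero-argument function; in this integration it would
    wrap a tracked_client.chat.completions.create(...) invocation.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:  # narrow this to the SDK's rate-limit error in practice
            if attempt == max_retries - 1:
                raise  # out of retries; let the caller handle it
            # Double the delay each attempt, with jitter to avoid thundering herds
            delay = base_delay * (2 ** attempt) * (0.5 + random.random())
            time.sleep(delay)
```

You would then call, for example, `with_backoff(lambda: tracked_client.chat.completions.create(...))`.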
Make sure to set the following environment variables:

```bash
# Fireworks AI Configuration
export FIREWORKS_API_KEY="your-fireworks-api-key"

# Opik Configuration
export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"
```