apps/opik-documentation/documentation/docs/cookbook/gemini.ipynb
Opik integrates with Gemini to provide a simple way to log traces for all Gemini LLM calls. This works for all Gemini models.
Comet provides a hosted version of the Opik platform; simply create an account and grab your API key. You can also run the Opik platform locally; see the installation guide for more information.
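If you go the self-hosted route, you can point the SDK at your local deployment instead of the hosted one. A minimal sketch, assuming a default local Opik installation is already running:

```python
import opik

# Point the SDK at a self-hosted Opik instance instead of the Comet-hosted one.
# Assumes Opik is already running locally (see the installation guide).
opik.configure(use_local=True)
```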
%pip install --upgrade opik google-genai litellm
import opik
opik.configure(use_local=False)
First, we will set our GOOGLE_API_KEY environment variable.
import os
import getpass
if "GOOGLE_API_KEY" not in os.environ:
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Gemini API key: ")
Now each completion will log a separate trace to Opik:
from google import genai
from opik import track
from opik.integrations.genai import track_genai
os.environ["OPIK_PROJECT_NAME"] = "gemini-integration-demo"
client = genai.Client()
gemini_client = track_genai(client)
prompt = """
Write a short two sentence story about Opik.
"""
response = gemini_client.models.generate_content(
    model="gemini-2.0-flash-001", contents=prompt
)
print(response.text)
The prompt and response messages are automatically logged to Opik and can be viewed in the UI.
Using the `track` decorator

If you have multiple steps in your LLM pipeline, you can use the `track` decorator to log the traces for each step. If Gemini is called within one of these steps, the LLM call will be associated with that corresponding step:
@track
def generate_story(prompt):
    response = gemini_client.models.generate_content(
        model="gemini-2.0-flash-001", contents=prompt
    )
    return response.text


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    response = gemini_client.models.generate_content(
        model="gemini-2.0-flash-001", contents=prompt
    )
    return response.text


@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story
generate_opik_story()
The trace can now be viewed in the UI: