# Quickstart

This guide helps you integrate the Opik platform with your existing LLM application. The goal of this guide is to help you log your first LLM calls and chains to the Opik platform.


## Prerequisites

Before you begin, you'll need to choose how you want to use Opik: you can use Opik Cloud or self-host the platform. The `opik configure` step below will prompt you for an API key (Opik Cloud) or for your server address (self-hosted).

## Logging your first LLM calls

Opik makes it easy to integrate with your existing LLM application. Here are some of our most popular integrations:

<Tabs> <Tab title="Python SDK" value="python-function-decorator"> If you are using Python, you can integrate using the `@track` function decorator:
<Steps>
  <Step>
    Install the Opik Python SDK:

    ```bash
    pip install opik
    ```
  </Step>
  <Step>
    Configure the Opik Python SDK:

    ```bash
    opik configure
    ```
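
    If you prefer to configure the SDK from code instead of the CLI, you can also call the
    configuration helper directly. A minimal sketch, assuming you are connecting to Opik
    Cloud (the values are placeholders):

    ```python
    import opik

    # Programmatic equivalent of the interactive CLI flow; replace the placeholders
    opik.configure(api_key="YOUR_API_KEY", workspace="YOUR_WORKSPACE")
    ```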
  </Step>
  <Step>
    Wrap your function with the `@track` decorator:

    ```python
    from opik import track

    @track
    def my_function(input: str) -> str:
        return input
    ```

    All calls to `my_function` will now be logged to Opik. This works for any function,
    even nested ones, and is also supported by most integrations (just wrap any parent
    function with the `@track` decorator).
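
    For example, a minimal sketch of nested tracked functions (the function names and
    logic here are hypothetical):

    ```python
    from opik import track

    @track
    def retrieve_context(query: str) -> str:
        # Hypothetical helper; logged as a nested span inside the parent trace
        return f"Context for: {query}"

    @track
    def answer_question(query: str) -> str:
        # Parent function: calling it creates the trace, nested calls appear as spans
        context = retrieve_context(query)
        return f"Answer based on: {context}"

    answer_question("What is Opik?")
    ```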
  </Step>
</Steps>
</Tab> <Tab title="TypeScript SDK" value="typescript-sdk"> If you want to use the TypeScript SDK to log traces directly:
<Steps>
  <Step>
    Install the Opik TypeScript SDK:

    ```bash
    npm install opik
    ```
  </Step>
  <Step>
    Configure the Opik TypeScript SDK by running the interactive CLI tool:

    ```bash
    npx opik-ts configure
    ```

    This will detect your project setup, install required dependencies, and help you configure environment variables.
  </Step>
  <Step>
    Log a trace using the Opik client:

    ```typescript
    import { Opik } from "opik";

    const client = new Opik();

    const trace = client.trace({
      name: "My LLM Application",
      input: { prompt: "What is the capital of France?" },
      output: { response: "The capital of France is Paris." },
    });

    trace.end();
    await client.flush();
    ```

    All traces will now be logged to Opik. You can also log spans within traces for more detailed observability.
  </Step>
</Steps>
</Tab> <Tab title="OpenAI (Python)" value="openai-python-sdk"> If you are using the OpenAI Python SDK, you can integrate by:
<Steps>
  <Step>
    Install the Opik Python SDK:

    ```bash
    pip install opik
    ```
  </Step>
  <Step>
    Configure the Opik Python SDK. This will prompt you for your API key if you are using
    Opik Cloud, or for your Opik server address if you are self-hosting:

    ```bash
    opik configure
    ```
  </Step>
  <Step>
    Wrap your OpenAI client with the `track_openai` function:

    ```python
    from opik.integrations.openai import track_openai
    from openai import OpenAI

    # Wrap your OpenAI client
    client = OpenAI()
    client = track_openai(client)

    # Use the client as normal
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": "Hello, how are you?"},
        ],
    )
    print(completion.choices[0].message.content)
    ```

    All OpenAI calls made using this client will now be logged to Opik. You can combine
    this with the `@track` decorator to log a trace for each step of your agent.
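
    A minimal sketch of that combination (the function name and prompt are hypothetical):

    ```python
    from openai import OpenAI
    from opik import track
    from opik.integrations.openai import track_openai

    client = track_openai(OpenAI())

    @track
    def summarize(text: str) -> str:
        # The OpenAI call below is logged as a span inside the summarize trace
        completion = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": f"Summarize this: {text}"}],
        )
        return completion.choices[0].message.content

    summarize("Opik is an open-source platform for LLM observability.")
    ```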

  </Step>
</Steps>
</Tab> <Tab title="OpenAI (TS)" value="openai-ts-sdk"> If you are using the OpenAI TypeScript SDK, you can integrate by:
<Steps>
  <Step>
    Install the Opik TypeScript SDK:

    ```bash
    npm install opik-openai
    ```
  </Step>
  <Step>
    Configure the Opik TypeScript SDK by running the interactive CLI tool:

    ```bash
    npx opik-ts configure
    ```

    This will detect your project setup, install required dependencies, and help you configure environment variables.
  </Step>
  <Step>
    Wrap your OpenAI client with the `trackOpenAI` function:

    ```typescript
    import OpenAI from "openai";
    import { trackOpenAI } from "opik-openai";

    // Initialize the original OpenAI client
    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });

    // Wrap the client with Opik tracking
    const trackedOpenAI = trackOpenAI(openai);

    // Use the tracked client just like the original
    const completion = await trackedOpenAI.chat.completions.create({
      model: "gpt-4",
      messages: [{ role: "user", content: "Hello, how can you help me today?" }],
    });
    console.log(completion.choices[0].message.content);

    // Ensure all traces are sent before your app terminates
    await trackedOpenAI.flush();
    ```

    All OpenAI calls made using the `trackedOpenAI` client will now be logged to Opik.

  </Step>
</Steps>
</Tab> <Tab title="Vercel AI SDK" value="ai-vercel-sdk"> If you are using the Vercel AI SDK, you can integrate by:
<Steps>
  <Step>
    Install the Opik Vercel integration:

    ```bash
    npm install opik-vercel
    ```
  </Step>
  <Step>
    Configure the Opik Vercel integration by running the interactive CLI tool:

    ```bash
    npx opik-ts configure
    ```

    This will detect your project setup, install required dependencies, and help you configure environment variables.
  </Step>
  <Step>
    Initialize the OpikExporter with your AI SDK:

    ```ts
    import { openai } from "@ai-sdk/openai";
    import { generateText } from "ai";
    import { NodeSDK } from "@opentelemetry/sdk-node";
    import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
    import { OpikExporter } from "opik-vercel";

    // Set up OpenTelemetry with Opik
    const sdk = new NodeSDK({
      traceExporter: new OpikExporter(),
      instrumentations: [getNodeAutoInstrumentations()],
    });
    sdk.start();

    // Your AI SDK calls with telemetry enabled
    const result = await generateText({
      model: openai("gpt-4o"),
      prompt: "What is love?",
      experimental_telemetry: { isEnabled: true },
    });

    console.log(result.text);
    ```

    All AI SDK calls with `experimental_telemetry: { isEnabled: true }` will now be logged to Opik.
  </Step>
</Steps>
</Tab> <Tab title="Ollama" value="ollama-python"> If you are using Ollama with Python, you can integrate by:
<Steps>
  <Step>
    Install the Opik Python SDK:

    ```bash
    pip install opik
    ```
  </Step>
  <Step>
    Configure the Opik Python SDK:

    ```bash
    opik configure
    ```
  </Step>
  <Step>
    Integrate Opik with your Ollama calls:

    <Tabs>
      <Tab title="Ollama Python Package">
        Wrap your Ollama calls with the `@track` decorator:

        ```python
        import ollama
        from opik import track

        @track
        def ollama_call(user_message: str):
            response = ollama.chat(
                model='llama3.1',
                messages=[{'role': 'user', 'content': user_message}]
            )
            return response['message']

        # Call your function
        result = ollama_call("Say this is a test")
        print(result)
        ```
      </Tab>
      <Tab title="OpenAI SDK">
        Use Opik's OpenAI integration with Ollama's OpenAI-compatible API:

        ```python
        from openai import OpenAI
        from opik.integrations.openai import track_openai

        # Create an OpenAI client pointing to Ollama
        client = OpenAI(
            base_url='http://localhost:11434/v1/',
            api_key='ollama'  # required but ignored
        )

        # Wrap the client with Opik tracking
        client = track_openai(client)

        # Call the local Ollama model
        response = client.chat.completions.create(
            model='llama3.1',
            messages=[{'role': 'user', 'content': 'Say this is a test'}]
        )
        print(response.choices[0].message.content)
        ```
      </Tab>
      <Tab title="LangChain">
        Use Opik's LangChain integration with Ollama:

        ```python
        from langchain_ollama import ChatOllama
        from opik.integrations.langchain import OpikTracer

        # Create the Opik tracer
        opik_tracer = OpikTracer()

        # Create the Ollama model with Opik tracing
        llm = ChatOllama(
            model="llama3.1",
            temperature=0,
        ).with_config({"callbacks": [opik_tracer]})

        # Call the Ollama model
        messages = [
            ("system", "You are a helpful assistant."),
            ("human", "Say this is a test")
        ]
        response = llm.invoke(messages)
        print(response)
        ```
      </Tab>
    </Tabs>

    All Ollama calls will now be logged to Opik. See the [full Ollama guide](/v1/integrations/ollama) for more advanced usage.
  </Step>
</Steps>
</Tab> <Tab title="ADK" value="adk-python"> If you are using the Google Agent Development Kit (ADK), you can integrate by:
<Steps>
  <Step>
    Install the Opik SDK:

    ```bash
    pip install opik google-adk
    ```
  </Step>
  <Step>
    Configure the Opik SDK by running the `opik configure` command in your terminal:

    ```bash
    opik configure
    ```
  </Step>
  <Step>
    Wrap your ADK agent with the `OpikTracer`:

    ```python
    from google.adk.agents import Agent
    from opik.integrations.adk import OpikTracer, track_adk_agent_recursive

    # Create your ADK agent
    agent = Agent(
        name="helpful_assistant",
        model="gemini-2.0-flash",
        instruction="You are a helpful assistant that answers user questions."
    )

    # Wrap your ADK agent with the OpikTracer
    opik_tracer = OpikTracer()
    track_adk_agent_recursive(agent, opik_tracer)
    ```

    All ADK agent calls will now be logged to Opik.
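
    To generate traces, run the agent as you normally would. A rough sketch using ADK's
    runner, with hypothetical app, user, and session identifiers (the session API may
    differ between `google-adk` versions):

    ```python
    import asyncio

    from google.adk.runners import Runner
    from google.adk.sessions import InMemorySessionService
    from google.genai import types

    # Hypothetical identifiers; create_session is async in recent ADK releases
    session_service = InMemorySessionService()
    asyncio.run(
        session_service.create_session(app_name="demo", user_id="user", session_id="s1")
    )

    runner = Runner(agent=agent, app_name="demo", session_service=session_service)

    # Each run is traced to Opik through the OpikTracer callbacks attached above
    for event in runner.run(
        user_id="user",
        session_id="s1",
        new_message=types.Content(role="user", parts=[types.Part(text="Hello!")]),
    ):
        if event.is_final_response():
            print(event.content.parts[0].text)
    ```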
  </Step>
</Steps>
</Tab> <Tab title="LangGraph" value="langgraph"> If you are using LangGraph, you can integrate by:
<Steps>
  <Step>
    Install the Opik SDK:

    ```bash
    pip install opik
    ```
  </Step>
  <Step>
    Configure the Opik SDK by running the `opik configure` command in your terminal:

    ```bash
    opik configure
    ```
  </Step>
  <Step>
    Track your LangGraph graph with `track_langgraph`:

    ```python
    from langchain_core.messages import HumanMessage
    from opik.integrations.langchain import OpikTracer, track_langgraph

    # Create your LangGraph graph
    graph = ...
    app = graph.compile(...)

    # Create OpikTracer and track the graph once
    # The graph visualization is automatically extracted by track_langgraph
    opik_tracer = OpikTracer()
    app = track_langgraph(app, opik_tracer)

    # Now all invocations are automatically tracked!
    result = app.invoke({"messages": [HumanMessage(content="How to use LangGraph?")]})
    ```

    All LangGraph calls will now be logged to Opik. No need to pass callbacks on every invocation!
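
    If you don't have a graph handy, here is a minimal sketch you could use in place of
    the `graph = ...` placeholder above (the node name and echo logic are hypothetical):

    ```python
    from langchain_core.messages import AIMessage
    from langgraph.graph import END, START, MessagesState, StateGraph

    def respond(state: MessagesState):
        # Hypothetical node: echo the last user message back
        last_message = state["messages"][-1].content
        return {"messages": [AIMessage(content=f"You said: {last_message}")]}

    graph = StateGraph(MessagesState)
    graph.add_node("respond", respond)
    graph.add_edge(START, "respond")
    graph.add_edge("respond", END)
    ```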
  </Step>
</Steps>
</Tab> <Tab title="AI Wizard" value="ai-installation"> <div style={{"display": "flex", "flexDirection": "row", "gap": "1rem", "alignItems": "center", "justifyContent": "space-between"}}> <span style={{"& p": {"margin": "0rem"}}}> <p style={{"margin": "0rem", "fontStyle": "italic"}}>Integrate with Opik faster using this pre-built prompt</p> </span> <Button intent="primary" href="cursor:////anysphere.cursor-deeplink/prompt?text=%23+OPIK+Agentic+Onboarding%0A%0A%23%23+Goals%0A%0AYou+must+help+me%3A%0A%0A1.+Integrate+the+Opik+client+with+my+existing+LLM+application%0A2.+Set+up+tracing+for+my+LLM+calls+and+chains%0A%0A%23%23+Rules%0A%0ABefore+you+begin%2C+you+must+understand+and+strictly+adhere+to+these+core+principles%3A%0A%0A1.+Code+Preservation+%26+Integration+Guidelines%3A%0A%0A+++-+Existing+business+logic+must+remain+untouched+and+unmodified%0A+++-+Only+add+Opik-specific+code+%28decorators%2C+imports%2C+handlers%2C+env+vars%29%0A+++-+Integration+must+be+non-invasive+and+backwards+compatible%0A%0A2.+Process+Requirements%3A%0A%0A+++-+Follow+the+workflow+steps+sequentially+without+deviation%0A+++-+Validate+completion+of+each+step+before+proceeding%0A+++-+Request+explicit+approval+for+any+workflow+modifications%0A%0A3.+Documentation+%26+Resources%3A%0A%0A+++-+Reference+official+Opik+documentation+at+https%3A%2F%2Fwww.comet.com%2Fdocs%2Fopik%2Fquickstart.md%0A+++-+Follow+Opik+best+practices+and+recommended+patterns%0A+++-+Maintain+detailed+integration+notes+and+configuration+details%0A%0A4.+Testing+%26+Validation%3A%0A+++-+Verify+Opik+integration+without+impacting+existing+functionality%0A+++-+Validate+tracing+works+correctly+for+all+LLM+interactions%0A+++-+Ensure+proper+error+handling+and+logging%0A%0A%23%23+Integration+Workflow%0A%0A%23%23%23+Step+1%3A+Language+and+Compatibility+Check%0A%0AFirst%2C+analyze+the+codebase+to+identify%3A%0A%0A1.+Primary+programming+language+and+frameworks%0A2.+Existing+LLM+integrations+and+patterns%0A%0ACompatibility+Requirements%3A%0A%0A-+Supported+Languages%3A+Python%2C+JavaScript%2FTypeScript%0A%0AIf+the+codebase+uses+unsupported+languages%3A%0A%0A-+Stop+immediately%0A-+Inform+me+that+the+codebase+is+unsupported+for+AI+integration%0A%0AOnly+proceed+to+Step+2+if%3A%0A%0A-+Language+is+Python+or+JavaScript%2FTypeScript%0A%0A%23%23%23+Step+2%3A+Codebase+Discovery+%26+Entrypoint+Confirmation%0A%0AAfter+verifying+language+compatibility%2C+perform+a+full+codebase+scan+with+the+following+objectives%3A%0A%0A-+LLM+Touchpoints%3A+Locate+all+files+and+functions+that+invoke+or+interface+with+LLMs+or+can+be+a+candidates+for+tracing.%0A-+Entrypoint+Detection%3A+Identify+the+primary+application+entry+point%28s%29+%28e.g.%2C+main+script%2C+API+route%2C+CLI+handler%29.+If+ambiguous%2C+pause+and+request+clarification+on+which+component%28s%29+are+most+important+to+trace+before+proceeding.%0A++%E2%9A%A0%EF%B8%8F+Do+not+proceed+to+Step+3+without+explicit+confirmation+if+the+entrypoint+is+unclear.%0A-+Return+the+LLM+Touchpoints+to+me%0A%0A%23%23%23+Step+3%3A+Discover+Available+Integrations%0A%0AAfter+I+confirm+the+LLM+Touchpoints+and+entry+point%2C+find+the+list+of+supported+integrations+at+https%3A%2F%2Fwww.comet.com%2Fdocs%2Fopik%2Fintegrations%2Foverview.md%0A%0A%23%23%23+Step+4%3A+Deep+Analysis+Confirmed+files+for+LLM+Frameworks+%26+SDKs%0A%0AUsing+the+files+confirmed+in+Step+2%2C+perform+targeted+inspection+to+detect+specific+LLM-related+technologies+in+use%2C+such+as%3A%0ASDKs%3A+openai%2C+anthropic%2C+huggingface%2C+etc.%0AFrameworks%3A+LangChain%2C+LlamaIndex%2C+Haystack%2C+
etc.%0A%0A%23%23%23+Step+5%3A+Pre-Implementation+Development+Plan+%28Approval+Required%29%0A%0ADo+not+write+or+modify+code+yet.+You+must+propose+me+a+step-by-step+plan+including%3A%0A%0A-+Opik+packages+to+install%0A-+Files+to+be+modified%0A-+Code+snippets+for+insertion%2C+clearly+scoped+and+annotated%0A-+Where+to+place+Opik+API+keys%2C+with+placeholder+comments+%28Visit+https%3A%2F%2Fcomet.com%2Fopik%2Fyour-workspace-name%2Fget-started+to+copy+your+API+key%29%0A++Wait+for+approval+before+proceeding%21%0A%0A%23%23%23+Step+6%3A+Execute+the+Integration+Plan%0A%0AAfter+approval%3A%0A%0A-+Run+the+package+installation+command+via+terminal+%28pip+install+opik%2C+npm+install+opik%2C+etc.%29.%0A-+Apply+code+modifications+exactly+as+described+in+Step+5.%0A-+Keep+all+additions+minimal+and+non-invasive.%0A++Upon+completion%2C+review+the+changes+made+and+confirm+installation+success.%0A%0A%23%23%23+Step+7%3A+Request+User+Review+and+Wait%0A%0ANotify+me+that+all+integration+steps+are+complete.%0A%22Please+run+the+application+and+verify+if+Opik+is+capturing+traces+as+expected.+Let+me+know+if+you+need+adjustments.%22%0A%0A%23%23%23+Step+8%3A+Debugging+Loop+%28If+Needed%29%0A%0AIf+issues+are+reported%3A%0A%0A1.+Parse+the+error+or+unexpected+behavior+from+feedback.%0A2.+Re-query+the+Opik+docs+using+https%3A%2F%2Fwww.comet.com%2Fdocs%2Fopik%2Fquickstart.md+if+needed.%0A3.+Propose+a+minimal+fix+and+await+approval.%0A4.+Apply+and+revalidate.%0A"> <div style={{"display": "flex", "flexDirection": "row", "gap": "1rem", "alignItems": "center"}}> <svg xmlns="http://www.w3.org/2000/svg" id="Ebene_1" version="1.1" viewBox="0 0 466.73 532.09"> <path style={{"fill": "#edecec"}} class="st0" d="M457.43,125.94L244.42,2.96c-6.84-3.95-15.28-3.95-22.12,0L9.3,125.94c-5.75,3.32-9.3,9.46-9.3,16.11v247.99c0,6.65,3.55,12.79,9.3,16.11l213.01,122.98c6.84,3.95,15.28,3.95,22.12,0l213.01-122.98c5.75-3.32,9.3-9.46,9.3-16.11v-247.99c0-6.65-3.55-12.79-9.3-16.11h-.01ZM444.05,151.99l-205.63,356.16c-1.39,2.4-5.06,1.42-5.06-1.36v-233.21c0-4.66-2.49-8.97-6.53-11.31L24.87,145.67c-2.4-1.39-1.42-5.06,1.36-5.06h411.26c5.84,0,9.49,6.33,6.57,11.39h-.01Z"/> </svg> Open in Cursor </div> </Button> </div>
The pre-built prompt will guide you through the integration process, install the Opik SDK,
and instrument your code. It supports both Python and TypeScript codebases; if you are
using another language, just let us know and we can help you out.

Once the integration is complete, simply run your application and you will start seeing traces
in your Opik dashboard.
</Tab> <Tab title="All integrations" value="all_integrations"> Opik has **30+ integrations** with popular frameworks and model providers:
<CardGroup cols={3}>
  <Card title="LangChain" href="/v1/integrations/langchain" />
  <Card title="LlamaIndex" href="/v1/integrations/llama_index" />
  <Card title="Anthropic" href="/v1/integrations/anthropic" />
  <Card title="AWS Bedrock" href="/v1/integrations/bedrock" />
  <Card title="Google Gemini" href="/v1/integrations/gemini" />
  <Card title="CrewAI" href="/v1/integrations/crewai" />
</CardGroup>

**[View all 30+ integrations →](/v1/integrations/overview)**
</Tab> </Tabs>

## Analyze your traces

After running your application, you will start seeing your traces in your Opik dashboard:

<video src="/img/tracing/quickstart.mp4" width="854" height="480" autoPlay muted loop playsInline preload="auto" />

If you don't see traces appearing, reach out to us on Slack or raise an issue on GitHub and we'll help you troubleshoot.

## Next steps

Now that you have logged your first LLM calls and chains to Opik, why not check out:

  1. In-depth guide on agent observability: Learn how to customize the data that is logged to Opik and how to log conversations.
  2. Opik Experiments: Opik allows you to automate the evaluation of your LLM application so that you no longer need to manually review every LLM response.
  3. Opik's evaluation metrics: Opik provides a suite of evaluation metrics (Hallucination, Answer Relevance, Context Recall, etc.) that you can use to score your LLM responses; a small example follows below.
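
As a taste of the metrics API, here is a minimal sketch using the `Hallucination` metric from the Python SDK (an LLM-as-a-judge metric, so it assumes a configured model provider such as an `OPENAI_API_KEY`):

```python
from opik.evaluation.metrics import Hallucination

# Scores whether the output is grounded in the provided context
metric = Hallucination()
score = metric.score(
    input="What is the capital of France?",
    output="The capital of France is Paris.",
    context=["France is a country in Europe. Paris is its capital."],
)
print(score.value, score.reason)
```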