# Getting started with Observability

Opik makes it easy to add observability to your existing LLM application. The fastest way is to let your coding agent do it — install the Opik skill in Claude Code, Cursor, Codex, or any other coding agent and it will instrument your code for you. If you'd rather stay inside Opik, use Opik Connect to have Ollie set up tracing from the dashboard. You can also add tracing manually with the SDK.

<video src="/img/v2/observability/getting-started.mp4" width="854" height="480" autoPlay muted loop playsInline controls preload="auto" />

## Adding observability to your code

<Tabs> <Tab title="AI coding agent"> The fastest way to add observability is to install the Opik skill in your coding agent and let it instrument your code for you. The skill is compatible with Claude Code, Codex, Cursor, OpenCode and any other agent that supports skills.
<Steps>
  <Step title="Install the Opik skill">
    ```bash
    npx skills add comet-ml/opik-skills
    ```
  </Step>
  <Step title="Run the integration">
    Ask your coding agent to instrument your code:
    ```
    Instrument my agent with Opik using the /instrument command.
    ```

    The agent will read your code, pick the right Opik integration, and add tracing (see the sketch after these steps for the kind of change to expect).
  </Step>
</Steps>
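
For example, if your application calls the OpenAI Python client directly, the agent might wrap it with Opik's OpenAI integration. A minimal sketch of the kind of edit to expect (your framework, client, and model will differ):

```python
from openai import OpenAI
from opik.integrations.openai import track_openai

# The agent wraps your existing client so every LLM call is traced.
# The model and prompt below are illustrative placeholders.
client = track_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world"}],
)
```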
</Tab> <Tab title="Opik Connect"> Opik Connect links your local repository to Opik so that [Ollie](/tracing/ollie), Opik's built-in AI coding assistant, can inspect your code and add tracing from the dashboard — no local agent setup required.
<Steps>
  <Step title="Install Opik">
    ```bash
    pip install opik
    ```
  </Step>
  <Step title="Set your environment variables">
    <Tabs>
      <Tab title="Opik Cloud">
        ```bash
        export OPIK_API_KEY="<YOUR_API_KEY>"
        export OPIK_WORKSPACE="<YOUR_WORKSPACE>"
        ```

        You can find your API key and workspace name in the [Opik dashboard](https://www.comet.com/opik).
      </Tab>
      <Tab title="Self-hosted">
        ```bash
        export OPIK_URL_OVERRIDE="http://localhost:5173/api"
        ```

        Replace the URL with your Opik instance address if it differs from the default.
      </Tab>
    </Tabs>
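
    If you prefer not to export environment variables, the SDK can also be configured
    from Python; `opik.configure` saves your credentials locally so later runs can pick
    them up. A minimal sketch, assuming Opik Cloud and placeholder values:

    ```python
    import opik

    # Equivalent to setting OPIK_API_KEY and OPIK_WORKSPACE above.
    opik.configure(api_key="<YOUR_API_KEY>", workspace="<YOUR_WORKSPACE>")
    ```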
  </Step>
  <Step title="Connect your repository">
    Run this command in the repository you want Ollie to work in:

    ```bash
    opik connect --project "<YOUR_PROJECT_NAME>"
    ```

    This creates a local connection between Opik and your machine so Ollie can inspect your
    code and help add tracing.
  </Step>
</Steps>

Once connected, open Opik and Ollie will help you instrument your code and set up tracing.
See the [Ollie documentation](/tracing/ollie) for more details.
</Tab> <Tab title="Manual integration"> Opik has integrations with all the popular Agent frameworks in both Python and TypeScript as well as first-class support for OpenTelemetry:
<CardGroup cols={3}>
  <Card title="LangChain" href="/integrations/langchain" icon={} iconPosition="left"/>
  <Card title="LlamaIndex" href="/integrations/llama_index" icon={} iconPosition="left"/>
  <Card title="Anthropic" href="/integrations/anthropic" icon={} iconPosition="left"/>
  <Card title="AWS Bedrock" href="/integrations/bedrock" icon={} iconPosition="left"/>
  <Card title="Google Gemini" href="/integrations/gemini" icon={} iconPosition="left"/>
  <Card title="CrewAI" href="/integrations/crewai" icon={} iconPosition="left"/>
</CardGroup>

**[View all 30+ integrations →](/integrations/overview)**

If your framework is not listed, you can use the `@track` decorator (Python) or `track` wrapper
(TypeScript) to manually instrument your code:

<CodeBlocks>
  ```python title="Python"
  import opik

  opik.configure()

  @opik.track
  def my_llm_call(user_message):
      # Your LLM call here
      response = call_llm(user_message)
      return response

  @opik.track(name="my-agent")
  def my_agent(user_message):
      context = retrieve_context(user_message)
      response = my_llm_call(f"{context}\n\n{user_message}")
      return response
  ```

  ```ts title="Typescript"
  import { Opik } from "opik";

  const client = new Opik();

  const myLlmCall = client.track({
    name: "my_llm_call",
    fn: async (userMessage: string) => {
      // Your LLM call here
      const response = await callLlm(userMessage);
      return response;
    },
  });

  const myAgent = client.track({
    name: "my-agent",
    fn: async (userMessage: string) => {
      const context = await retrieveContext(userMessage);
      const response = await myLlmCall(`${context}\n\n${userMessage}`);
      return response;
    },
  });
  ```
</CodeBlocks>
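
When `my_agent` runs, Opik logs a single trace for the top-level call, with `my_llm_call` recorded as a nested span inside it. The decorator and wrapper propagate trace context through nested calls automatically, so you only need to annotate the functions you want to see in the trace tree.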
</Tab> </Tabs>

## Viewing your traces

After running your application, traces will appear in the Opik dashboard. Each trace captures the full execution path of a request, including all nested spans, inputs, outputs, and timing information.
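
Traces can also be queried programmatically. A minimal sketch using the Python client's `search_traces` method (the project name is a placeholder):

```python
import opik

client = opik.Opik()

# Fetch recent traces from a project and print a short summary of each.
traces = client.search_traces(project_name="<YOUR_PROJECT_NAME>")
for trace in traces:
    print(trace.id, trace.name)
```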

You can use Ollie to analyze your traces, identify issues in your agent's behavior, and get actionable suggestions for improvement.

## Next steps

- **Concepts**: Learn about traces, spans, threads, and feedback scores
- **Log traces**: In-depth guide on customizing what gets logged
- **Cost tracking**: Monitor token usage and spending