Observability for Pydantic AI with Opik


Pydantic AI is a Python agent framework designed to build production-grade applications with Generative AI.

Pydantic AI's primary advantage is its integration of Pydantic's type-safe data validation, ensuring structured and reliable responses in AI applications.
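As a quick illustration of the kind of type-safe validation Pydantic AI builds on, here is a plain-Pydantic sketch (the `CityAnswer` model is hypothetical, not part of the integration): well-formed data is coerced into a typed object, malformed data is rejected with a clear error.

```python
from pydantic import BaseModel, ValidationError

# Hypothetical response schema, for illustration only.
class CityAnswer(BaseModel):
    city: str
    population: int

# Well-formed data passes validation and is coerced to the declared types:
answer = CityAnswer.model_validate({"city": "Paris", "population": "2100000"})
assert answer.population == 2_100_000

# Malformed data raises a ValidationError instead of propagating silently:
try:
    CityAnswer.model_validate({"city": "Paris", "population": "lots"})
except ValidationError as err:
    print(err.error_count(), "validation error")
```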

Account Setup

Comet provides a hosted version of the Opik platform. Simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

Getting Started

Installation

To use the Pydantic AI integration with Opik, you will need to have Pydantic AI and logfire installed:

```bash
pip install --upgrade pydantic-ai logfire 'logfire[httpx]'
```

Configuring Pydantic AI

In order to use Pydantic AI, you will need to configure your LLM provider API keys. For this example, we'll use OpenAI; you can find or create your API keys in your provider's dashboard.

You can set them as environment variables:

```bash
export OPENAI_API_KEY="YOUR_API_KEY"
```

Or set them programmatically:

```python
import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```

Configuring OpenTelemetry

You will need to set the following environment variables to make sure the data is logged to Opik:

<Tabs>
<Tab value="Opik Cloud" title="Opik Cloud">
    If you are using Opik Cloud, you will need to set the following environment variables:
    ```bash
    export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
    export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
    export OTEL_METRICS_EXPORTER=none
    ```

    <Tip>
        To log the traces to a specific project, you can add the `projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS` environment variable:

        ```bash
        export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
        ```

        You can also update the `Comet-Workspace` parameter to a different value if you would like to log the data
        to a different workspace.
    </Tip>
</Tab>
<Tab value="Enterprise deployment" title="Enterprise deployment">
    If you are using an Enterprise deployment of Opik, you will need to set the following
    environment variables:

    ```bash wordWrap
    export OTEL_EXPORTER_OTLP_ENDPOINT=https://<comet-deployment-url>/opik/api/v1/private/otel
    export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
    export OTEL_METRICS_EXPORTER=none
    ```

    <Tip>
        To log the traces to a specific project, you can add the
        `projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS`
        environment variable:

        ```bash wordWrap
        export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
        ```

        You can also update the `Comet-Workspace` parameter to a different
        value if you would like to log the data to a different workspace.
    </Tip>
</Tab>
<Tab value="Self-hosted instance" title="Self-hosted instance">

If you are self-hosting Opik, you will need to set the following environment variables:

```bash
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
export OTEL_METRICS_EXPORTER=none
```

<Tip>
    To log the traces to a specific project, you can add the `projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS` environment variable:

    ```bash
    export OTEL_EXPORTER_OTLP_HEADERS='projectName=<your-project-name>'
    ```

</Tip>
</Tab>
</Tabs>
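If you prefer to configure these values from Python rather than the shell, the same variables can be set with `os.environ` before Logfire is configured. A minimal sketch using the Opik Cloud endpoint (substitute your own API key, workspace, and deployment URL as needed):

```python
import os

# Point the OTEL exporter at Opik (Opik Cloud endpoint shown here; use your
# deployment's URL for enterprise or self-hosted installs).
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = (
    "https://www.comet.com/opik/api/v1/private/otel"
)
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "Authorization=<your-api-key>,Comet-Workspace=default"
)
# Opik ingests traces, not metrics, so disable the metrics exporter.
os.environ["OTEL_METRICS_EXPORTER"] = "none"
```

These must be set before `logfire.configure()` is called, since the exporter reads them at configuration time.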

Using Opik with Pydantic AI

To track your Pydantic AI agents, you will need to configure Logfire, as this is the instrumentation library Pydantic AI uses to enable tracing.

```python
import logfire

logfire.configure(
    send_to_logfire=False,
)
logfire.instrument_pydantic_ai()
```

Practical Example

Now that everything is configured, you can create and run Pydantic AI agents:

```python
import nest_asyncio
from pydantic_ai import Agent

# Enable async support in Jupyter notebooks
nest_asyncio.apply()

# Create a simple agent
agent = Agent(
    "openai:gpt-4o",
    system_prompt="Be concise, reply with one sentence.",
)

# Run the agent
result = agent.run_sync('Where does "hello world" come from?')
print(result.data)
```

Logging threads

You can group multiple agent calls into a conversation thread by setting `thread_id` as a span attribute on the root Logfire span. Opik's OTEL ingestion recognizes this attribute and maps it directly to the trace's `thread_id` field:

```python
# Logfire wraps OTEL - thread_id becomes a span attribute automatically
thread_id = "demo-thread-1"  # any stable identifier for the conversation
with logfire.span("chat_turn", thread_id=thread_id):
    result = agent.run_sync("What is machine learning?")
```
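For a multi-turn conversation, generate one identifier up front and reuse it for every turn so all the traces land in the same thread. A minimal sketch (`new_thread_id` is just an illustrative helper, not part of the integration):

```python
import uuid

def new_thread_id() -> str:
    # Any stable string works as a thread_id; a UUID avoids collisions.
    return str(uuid.uuid4())

thread_id = new_thread_id()

# Reuse the same thread_id for every turn of the conversation:
# with logfire.span("chat_turn", thread_id=thread_id):
#     result = agent.run_sync("What is machine learning?")
# with logfire.span("chat_turn", thread_id=thread_id):
#     followup = agent.run_sync("Give one concrete example.")
```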

Further improvements

If you would like to see us improve this integration, simply open a new feature request on GitHub.