Smolagents is a framework from Hugging Face that allows you to create AI agents with various capabilities.
The framework provides a simple way to build agents that can perform tasks such as writing code and searching the web, with built-in support for multiple tools and LLM providers.
Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.
You can also run the Opik platform locally; see the installation guide for more information.
To use the Smolagents integration with Opik, you will need to have Smolagents and the required OpenTelemetry packages installed:
```bash wordWrap
pip install --upgrade opik 'smolagents[telemetry,toolkit]' opentelemetry-sdk opentelemetry-exporter-otlp
```
In order to use Smolagents, you will need to configure your LLM provider API keys. For this example, we'll use OpenAI. You can find or create your API key in your OpenAI account settings.
You can set them as environment variables:
```bash
export OPENAI_API_KEY="YOUR_API_KEY"
```
Or set them programmatically:
```python
import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```
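Since both the OpenAI key and the OpenTelemetry settings below are plain environment variables, a small stdlib check can catch a missing one before the agent starts. `missing_env` is a hypothetical helper written for this guide, not part of Opik or Smolagents:

```python
import os

def missing_env(required):
    """Return the names from `required` that are not set in the environment."""
    return [name for name in required if name not in os.environ]

# Variables this guide relies on (adjust the list to your deployment).
REQUIRED = ["OPENAI_API_KEY", "OTEL_EXPORTER_OTLP_ENDPOINT", "OTEL_EXPORTER_OTLP_HEADERS"]

unset = missing_env(REQUIRED)
if unset:
    print(f"Missing environment variables: {unset}")
```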
You will need to set the following environment variables to configure OpenTelemetry to send data to Opik:
<Tabs>
<Tab value="Opik Cloud" title="Opik Cloud">
If you are using Opik Cloud, you will need to set the following environment variables:
```bash wordWrap
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
```
<Tip>
To log the traces to a specific project, you can add the
`projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS`
environment variable:
```bash wordWrap
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
```
You can also update the `Comet-Workspace` parameter to a different
value if you would like to log the data to a different workspace.
</Tip>
</Tab>
<Tab value="Enterprise deployment" title="Enterprise deployment">
If you are using an Enterprise deployment of Opik, you will need to set the following
environment variables:
```bash wordWrap
export OTEL_EXPORTER_OTLP_ENDPOINT=https://<comet-deployment-url>/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
```
<Tip>
To log the traces to a specific project, you can add the
`projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS`
environment variable:
```bash wordWrap
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
```
You can also update the `Comet-Workspace` parameter to a different
value if you would like to log the data to a different workspace.
</Tip>
</Tab>
<Tab value="Self-hosted instance" title="Self-hosted instance">
If you are self-hosting Opik, you will need to set the following environment
variables:
```bash
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
```
<Tip>
To log the traces to a specific project, you can add the `projectName`
parameter to the `OTEL_EXPORTER_OTLP_HEADERS` environment variable:
```bash
export OTEL_EXPORTER_OTLP_HEADERS='projectName=<your-project-name>'
```
</Tip>
</Tab>
</Tabs>
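For reference, the OTLP exporter treats `OTEL_EXPORTER_OTLP_HEADERS` as a comma-separated list of `key=value` pairs. A quick sketch of how the example value above breaks down (the API key and project name here are placeholders):

```python
headers = "Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>"

# Split the comma-separated string into individual key=value headers
parsed = dict(item.split("=", 1) for item in headers.split(","))

print(parsed["Comet-Workspace"])  # default
```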
To track your Smolagents applications, you will need to configure OpenTelemetry to instrument the framework:
```python
# Import telemetry components
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Import the Smolagents instrumentor from OpenInference
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

# Import the OTLP exporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Set up the tracer provider with a span processor that exports via OTLP
trace_provider = TracerProvider()
trace_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter()))

# Instrument Smolagents
SmolagentsInstrumentor().instrument(tracer_provider=trace_provider)

# Now use Smolagents as usual
from smolagents import CodeAgent, WebSearchTool, OpenAIServerModel

model = OpenAIServerModel(model_id="gpt-4o")

agent = CodeAgent(
    tools=[WebSearchTool()],
    model=model,
    stream_outputs=True,
)

agent.run("How many seconds would it take for a leopard at full speed to run through Pont des Arts?")
```
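Once the run completes, the resulting trace appears in Opik. As a rough back-of-the-envelope check on the answer the agent should arrive at (the speed and bridge-length figures below are approximations, not part of the integration):

```python
# Rough sanity check for the agent's final answer, using approximate
# figures (assumptions, not from the Smolagents docs): a leopard's top
# speed is about 58 km/h and Pont des Arts is about 155 m long.
leopard_speed_m_s = 58 / 3.6   # ~16.1 m/s
bridge_length_m = 155

seconds = bridge_length_m / leopard_speed_m_s
print(f"~{seconds:.1f} seconds")  # roughly 9-10 seconds
```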
If you would like to see us improve this integration, simply open a new feature request on GitHub.