BeeAI is an agent framework focused on simplicity and performance. It provides a clean API for building agents, with built-in support for tool usage, conversation management, and an extensible architecture.
BeeAI's primary advantage is its lightweight design, which makes it easy to create and deploy AI agents without unnecessary complexity while retaining the capabilities needed for production use.
To use the BeeAI integration with Opik, you will need to have BeeAI and the required OpenTelemetry packages installed:

```bash
pip install beeai-framework openinference-instrumentation-beeai "beeai-framework[wikipedia]" opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp
```
Configure your environment variables based on your Opik deployment:
<Tabs>
<Tab value="Opik Cloud" title="Opik Cloud">
If you are using Opik Cloud, you will need to set the following environment variables:

```bash wordWrap
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
```
<Tip>
To log the traces to a specific project, you can add the
`projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS`
environment variable:
```bash wordWrap
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
```
You can also update the `Comet-Workspace` parameter to a different
value if you would like to log the data to a different workspace.
</Tip>
</Tab>
<Tab value="Enterprise deployment" title="Enterprise deployment">
If you are using an Enterprise deployment of Opik, you will need to set the following
environment variables:
```bash wordWrap
export OTEL_EXPORTER_OTLP_ENDPOINT=https://<comet-deployment-url>/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
```
<Tip>
To log the traces to a specific project, you can add the
`projectName` parameter to the `OTEL_EXPORTER_OTLP_HEADERS`
environment variable:
```bash wordWrap
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'
```
You can also update the `Comet-Workspace` parameter to a different
value if you would like to log the data to a different workspace.
</Tip>
</Tab>
<Tab value="Self-hosted instance" title="Self-hosted instance">
If you are self-hosting Opik, you will need to set the following environment
variables:
```bash
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
```
<Tip>
To log the traces to a specific project, you can add the `projectName`
parameter to the `OTEL_EXPORTER_OTLP_HEADERS` environment variable:
```bash
export OTEL_EXPORTER_OTLP_HEADERS='projectName=<your-project-name>'
```
</Tip>
</Tab>
</Tabs>
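In each deployment option above, the `OTEL_EXPORTER_OTLP_HEADERS` value follows the same shape: a comma-separated list of `key=value` pairs. As a quick sanity check, the following sketch (plain Python, with a hypothetical helper name) shows how such a value is assembled:

```python
# Hypothetical helper: build an OTEL_EXPORTER_OTLP_HEADERS value from a dict.
# The OTLP headers variable is a comma-separated list of key=value pairs.
def build_otlp_headers(pairs: dict[str, str]) -> str:
    return ",".join(f"{key}={value}" for key, value in pairs.items())

headers = build_otlp_headers({
    "Authorization": "<your-api-key>",
    "Comet-Workspace": "default",
    "projectName": "<your-project-name>",
})
print(headers)
# Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>
```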
Set up OpenTelemetry instrumentation for BeeAI:
```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Configure the OTLP exporter for Opik
otlp_exporter = OTLPSpanExporter()

# Set up the tracer provider
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)  # OTLP for sending to Opik
)

# Instrument BeeAI
BeeAIInstrumentor().instrument()
```
Once instrumentation is configured, use BeeAI as usual; agent runs are traced and exported to Opik automatically:

```python
import asyncio

from beeai_framework.agents.react import ReActAgent
from beeai_framework.agents.types import AgentExecutionConfig
from beeai_framework.backend.chat import ChatModel
from beeai_framework.backend.types import ChatModelParameters
from beeai_framework.memory import TokenMemory
from beeai_framework.tools.search.wikipedia import WikipediaTool
from beeai_framework.tools.weather.openmeteo import OpenMeteoTool

# Initialize the language model
llm = ChatModel.from_name(
    "openai:gpt-4o-mini",  # or "ollama:granite3.3:8b" for local Ollama
    ChatModelParameters(temperature=0.7),
)

# Create tools for the agent
tools = [
    WikipediaTool(),
    OpenMeteoTool(),
]

# Create a ReAct agent with memory
agent = ReActAgent(llm=llm, tools=tools, memory=TokenMemory(llm))

# Run the agent
async def main():
    response = await agent.run(
        prompt="I'm planning a trip to Barcelona, Spain. Can you research key attractions and landmarks I should visit, and also tell me what the current weather conditions are like there?",
        execution=AgentExecutionConfig(
            max_retries_per_step=3, total_max_retries=10, max_iterations=5
        ),
    )
    print("Agent Response:", response.result.text)
    return response

# Run the example
if __name__ == "__main__":
    asyncio.run(main())
```
If you have any questions or suggestions for improving the BeeAI integration, please open an issue on our GitHub repository.