Mastra is the TypeScript agent framework designed to provide the essential primitives for building AI applications. It enables developers to create AI agents with memory and tool-calling capabilities, implement deterministic LLM workflows, and leverage RAG for knowledge integration.
Mastra's primary advantage is its built-in telemetry support that automatically captures agent interactions, LLM calls, and workflow executions, making it easy to monitor and debug AI applications.
If you don't have a Mastra project yet, you can create one using the Mastra CLI:
```bash
npx create-mastra
cd your-mastra-project
```
Install the necessary dependencies for Mastra observability:
```bash
npm install @mastra/observability @mastra/otel
```
Create or update your `.env` file with the following variables:

<Tab value="Opik Cloud" title="Opik Cloud">
```bash wordWrap
# Your LLM API key
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
# Opik configuration
OPIK_API_KEY=<your-opik-api-key>
OPIK_WORKSPACE_NAME=<your-workspace>
OPIK_PROJECT_NAME=<your-project-name>
```
</Tab>
<Tab value="Enterprise deployment" title="Enterprise deployment">
```bash wordWrap
# Your LLM API key
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
# Opik configuration
OPIK_API_KEY=<your-opik-api-key>
OPIK_WORKSPACE_NAME=<your-workspace>
OPIK_PROJECT_NAME=<your-project-name>
```
</Tab>
<Tab value="Self-hosted instance" title="Self-hosted instance">
```bash wordWrap
# Your LLM API key
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
# Opik configuration
OPIK_PROJECT_NAME=<your-project-name>
```
</Tab>
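The exporter reads these values from the environment when the server starts, so it is worth failing fast if one is missing. Below is a minimal sketch of such a check; the helper name and shape are assumptions for illustration, not part of Mastra or Opik:

```typescript
// Hypothetical helper: report which Opik variables are missing from the
// environment (Opik Cloud and enterprise setups need all three).
function missingOpikVars(env: Record<string, string | undefined>): string[] {
  const required = ["OPIK_API_KEY", "OPIK_WORKSPACE_NAME", "OPIK_PROJECT_NAME"];
  return required.filter((name) => !env[name]);
}

// Warn early, before the Mastra server boots.
const missing = missingOpikVars(process.env);
if (missing.length > 0) {
  console.warn(`Missing Opik configuration: ${missing.join(", ")}`);
}
```

For a self-hosted instance, only `OPIK_PROJECT_NAME` is required, so you would trim the `required` list accordingly.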
Create an agent in your project. For example, create a file `src/mastra/index.ts`:

<Tab value="Opik Cloud" title="Opik Cloud">
```typescript
import { Mastra } from "@mastra/core/mastra";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";
const OPIK_API_KEY = process.env.OPIK_API_KEY!;
const OPIK_WORKSPACE_NAME = process.env.OPIK_WORKSPACE_NAME!;
const OPIK_PROJECT_NAME = process.env.OPIK_PROJECT_NAME!;

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
});

export const mastra = new Mastra({
  agents: { chefAgent },
  storage: new LibSQLStore({
    url: ":memory:",
  }),
  logger: new PinoLogger({
    name: "Mastra",
    level: "info",
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "chef-agent",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "https://www.comet.com/opik/api/v1/private/otel/v1/traces",
                protocol: "http/json",
                headers: {
                  Authorization: OPIK_API_KEY,
                  "Comet-Workspace": OPIK_WORKSPACE_NAME,
                  projectName: OPIK_PROJECT_NAME,
                },
              },
            },
          }),
        ],
      },
    },
  }),
});
```
</Tab>
<Tab value="Enterprise deployment" title="Enterprise deployment">
```typescript
import { Mastra } from "@mastra/core/mastra";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

const OPIK_API_KEY = process.env.OPIK_API_KEY!;
const OPIK_WORKSPACE_NAME = process.env.OPIK_WORKSPACE_NAME!;
const OPIK_PROJECT_NAME = process.env.OPIK_PROJECT_NAME!;

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
});

export const mastra = new Mastra({
  agents: { chefAgent },
  storage: new LibSQLStore({
    url: ":memory:",
  }),
  logger: new PinoLogger({
    name: "Mastra",
    level: "info",
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "chef-agent",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "https://<comet-deployment-url>/opik/api/v1/private/otel/v1/traces",
                protocol: "http/json",
                headers: {
                  Authorization: OPIK_API_KEY,
                  "Comet-Workspace": OPIK_WORKSPACE_NAME,
                  projectName: OPIK_PROJECT_NAME,
                },
              },
            },
          }),
        ],
      },
    },
  }),
});
```
</Tab>
<Tab value="Self-hosted instance" title="Self-hosted instance">
```typescript
import { Mastra } from "@mastra/core/mastra";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

const OPIK_PROJECT_NAME = process.env.OPIK_PROJECT_NAME!;

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
});

export const mastra = new Mastra({
  agents: { chefAgent },
  storage: new LibSQLStore({
    url: ":memory:",
  }),
  logger: new PinoLogger({
    name: "Mastra",
    level: "info",
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "chef-agent",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "http://localhost:5173/api/v1/private/otel/v1/traces",
                protocol: "http/json",
                headers: {
                  projectName: OPIK_PROJECT_NAME,
                },
              },
            },
          }),
        ],
      },
    },
  }),
});
```
</Tab>
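The three configurations above differ only in the endpoint's base URL and the headers sent with each export. If you deploy against more than one environment, a small helper can derive the traces endpoint from a base URL; the function below is a hypothetical sketch, not something provided by Mastra or Opik:

```typescript
// Hypothetical helper: append Opik's OTLP traces path to a deployment base URL.
// Example base URLs: "https://www.comet.com/opik" (Opik Cloud),
// "http://localhost:5173" (self-hosted instance).
function otelTracesEndpoint(baseUrl: string): string {
  // Strip any trailing slashes so the path is joined cleanly.
  return `${baseUrl.replace(/\/+$/, "")}/api/v1/private/otel/v1/traces`;
}

// otelTracesEndpoint("http://localhost:5173")
//   → "http://localhost:5173/api/v1/private/otel/v1/traces"
```

You could then pass `otelTracesEndpoint(process.env.OPIK_URL!)` as the `endpoint` value and keep one exporter configuration for all environments.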
Start the Mastra development server:
```bash
npm run dev
```
Head over to the developer playground at the URL printed in your terminal and start chatting with your agent.
With this setup, your Mastra application will automatically trace:

- Agent interactions, including inputs and final responses
- LLM calls, with model and token usage details
- Tool executions made by your agents
- Workflow runs and their individual steps
If you have any questions or suggestions for improving the Mastra integration, please open an issue on our GitHub repository.