apps/opik-documentation/documentation/docs/cookbook/agentspec.ipynb
Agent Spec is a portable configuration language for defining agentic systems (agents, tools, and structured workflows).
In this notebook, we will build a simple Agent Spec agent and use Opik's AgentSpecInstrumentor to capture a trace of the agent's tool and LLM execution.
Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.
You can also run the Opik platform locally; see the installation guide for more information.
%pip install --upgrade opik "pyagentspec[langgraph]" opentelemetry-sdk opentelemetry-instrumentation
import opik
opik.configure(use_local=False)
This demo uses OpenAI as the LLM provider. Set your OpenAI API key as an environment variable:
import os
import getpass
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
We'll define a small calculator agent with a couple of tools:
from pyagentspec.agent import Agent
from pyagentspec.llms import OpenAiConfig
from pyagentspec.property import FloatProperty
from pyagentspec.tools import ServerTool
def build_agentspec_agent() -> Agent:
    tools = [
        ServerTool(
            name="sum",
            description="Sum two numbers",
            inputs=[FloatProperty(title="a"), FloatProperty(title="b")],
            outputs=[FloatProperty(title="result")],
        ),
        ServerTool(
            name="subtract",
            description="Subtract two numbers",
            inputs=[FloatProperty(title="a"), FloatProperty(title="b")],
            outputs=[FloatProperty(title="result")],
        ),
    ]
    return Agent(
        name="calculator_agent",
        description="An agent that provides assistance with tool use.",
        llm_config=OpenAiConfig(name="openai-gpt-5-mini", model_id="gpt-5-mini"),
        system_prompt=(
            "You are a helpful calculator agent.\n"
            "Your duty is to compute the result of the given operation using tools, "
            "and to output the result.\n"
            "It's important that you reply with the result only.\n"
        ),
        tools=tools,
    )
Wrap the agent execution in AgentSpecInstrumentor().instrument_context(...) to capture traces in Opik.
Agent traces can include prompts, tool inputs/outputs, and messages. If you need to avoid logging sensitive information, set
mask_sensitive_information=True.
from opik.integrations.agentspec import AgentSpecInstrumentor
from pyagentspec.adapters.langgraph import AgentSpecLoader
agent = build_agentspec_agent()
tool_registry = {
    "sum": lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
}
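Each callable in the registry should match the schema of the `ServerTool` it implements: two float inputs (`a`, `b`) and one float result. A minimal standalone sanity check of that contract (the `tool_registry` here mirrors the one above) might look like:

```python
# Stand-in for the registry above; each callable takes the tool's declared
# inputs (a, b) and returns the declared float result.
tool_registry = {
    "sum": lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
}

# Exercise every tool with sample float inputs and confirm float outputs.
for name, fn in tool_registry.items():
    result = fn(13.5, 2.25)
    assert isinstance(result, float), f"{name} should return a float"

print(tool_registry["sum"](13.5, 2.25))    # 15.75
print(tool_registry["subtract"](10, 3.5))  # 6.5
```

Catching a mismatch here is cheaper than debugging a failed tool call inside the agent loop.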
langgraph_agent = AgentSpecLoader(tool_registry=tool_registry).load_component(agent)
with AgentSpecInstrumentor().instrument_context(
    project_name="agentspec-demo",
    mask_sensitive_information=False,
):
    messages = []
    messages.append({"role": "user", "content": "Compute 13.5 + 2.25 using the sum tool."})
    response = langgraph_agent.invoke(
        input={"messages": messages},
        config={"configurable": {"thread_id": "1"}},
    )
    agent_answer = response["messages"][-1].content.strip()
    print("AGENT >>>", agent_answer)

    messages.append({"role": "assistant", "content": agent_answer})
    messages.append({"role": "user", "content": "Now compute 10 - 3.5 using the subtract tool."})
    response = langgraph_agent.invoke(
        input={"messages": messages},
        config={"configurable": {"thread_id": "1"}},
    )
    agent_answer = response["messages"][-1].content.strip()
    print("AGENT >>>", agent_answer)
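The `invoke` call returns a dict whose `"messages"` list holds message objects exposing a `.content` attribute; the agent's reply is the last entry, and stripping whitespace yields the final answer. A small sketch of that extraction, using a hypothetical `SimpleNamespace` stand-in instead of a real LangGraph response:

```python
from types import SimpleNamespace

# Hypothetical stand-in for the LangGraph response: a dict whose "messages"
# list ends with the agent's reply (message objects expose `.content`).
response = {
    "messages": [
        SimpleNamespace(role="user", content="Compute 13.5 + 2.25 using the sum tool."),
        SimpleNamespace(role="assistant", content="  15.75\n"),
    ]
}

# Same extraction as above: take the last message and strip whitespace.
agent_answer = response["messages"][-1].content.strip()
print(agent_answer)  # 15.75
```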
After running the cell above, open Opik and navigate to the agentspec-demo project to inspect the trace tree and debug tool usage and LLM generations.