apps/opik-documentation/documentation/fern/docs-v2/observability/overview.mdx
LLM applications are more than a single API call. A typical agent involves retrieval steps, tool calls, prompt assembly, multiple LLM invocations, and post-processing — all wired together in ways that are invisible at runtime. When something goes wrong, you need to see exactly what happened at every step.
Opik gives you full visibility into every request your agent handles. Every LLM call, every tool invocation, every retrieval step is captured as a trace you can inspect, search, and analyze.
<Frame> </Frame>

Debugging LLM applications without observability means guessing. You see the final output but not why the model hallucinated, which retrieval step returned irrelevant context, or where latency spiked.
With Opik, you can see exactly where each of these failures happens, step by step.

The fastest way to add instrumentation is with Opik's skills for coding agents. First, connect to your project:

```bash
opik connect --project <YOUR_PROJECT_NAME>
```

Then install the Opik skills:

```bash
npx skills add comet-ml/opik-skills
```
Then ask your coding agent:
```
Instrument my agent with Opik using the /instrument command.
```
This works with Claude Code, Cursor, Codex, OpenCode, and other coding agents. You can also instrument manually with the SDK:
<CodeBlocks>
```python title="Python"
import opik

@opik.track
def my_agent(user_message):
    context = retrieve_context(user_message)
    response = call_llm(user_message, context)
    return response
```
```ts title="TypeScript"
import { Opik } from "opik";

const client = new Opik();

const myAgent = client.track({
  name: "my-agent",
  fn: async (userMessage: string) => {
    const context = await retrieveContext(userMessage);
    return await callLlm(userMessage, context);
  },
});
```
</CodeBlocks>
<Frame>
</Frame>
Opik has first-class support for 30+ frameworks in Python, TypeScript, and OpenTelemetry — so you can start capturing traces without changing how your application is built.
<CardGroup cols={3}>
  <Card title="LangChain" href="/integrations/langchain" />
  <Card title="LlamaIndex" href="/integrations/llama_index" />
  <Card title="Anthropic" href="/integrations/anthropic" />
  <Card title="AWS Bedrock" href="/integrations/bedrock" />
  <Card title="Google Gemini" href="/integrations/gemini" />
  <Card title="CrewAI" href="/integrations/crewai" />
</CardGroup>