This hands-on video demonstrates how to implement tracing in Opik, the foundation of LLM observability. You'll learn how traces capture complete interactions between your application and LLMs (inputs, outputs, metadata, and feedback scores), and see step-by-step implementation using OpenAI as an example. Think of traces as the equivalent of logs in traditional software, but specifically designed for LLM applications.
Key takeaways:

- Use Opik's OpenAI integration (`track_openai`) for automatic tracing, or the `@track` decorator for custom function tracing.
- The `@track` decorator creates detailed trace stacks that mirror your function structure, making debugging intuitive.