examples/guides/opentelemetry-otlp/README.md
This example shows how to export traces from the TensorZero Gateway to an external OpenTelemetry-compatible observability system.
Here, we'll export traces to a local instance of Jaeger.
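For context, the gateway exports traces when OTLP export is enabled in its configuration file. Below is a minimal sketch, assuming the section name `[gateway.export.otlp.traces]` and the standard OpenTelemetry `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` environment variable; check this example's configuration and `docker-compose.yml` for the exact setup.

```toml
# tensorzero.toml (sketch)
# Enable OTLP trace export from the gateway.
# The collector endpoint (e.g. Jaeger) is assumed to be supplied via the
# OTEL_EXPORTER_OTLP_TRACES_ENDPOINT environment variable.
[gateway.export.otlp.traces]
enabled = true
```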
## Setup

1. Generate an OpenAI API key (`OPENAI_API_KEY`).
2. Set the `OPENAI_API_KEY` environment variable.
3. Run `docker compose up` to launch the services.

## Usage

First, let's make an inference request to the gateway.
curl -X POST "http://localhost:3000/openai/v1/chat/completions" \
-H "Content-Type: application/json" \
-d '{
"model": "tensorzero::model_name::openai::gpt-4o-mini",
"messages": [
{
"role": "user",
"content": "Write a haiku about TensorZero."
}
]
}'
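If you prefer Python, here is a minimal sketch of the same request using the `requests` package. It assumes the gateway's OpenAI-compatible response includes a top-level `episode_id` field, which you'll need for the feedback request below.

```python
import requests

# Make an inference request through the gateway's OpenAI-compatible endpoint.
response = requests.post(
    "http://localhost:3000/openai/v1/chat/completions",
    json={
        "model": "tensorzero::model_name::openai::gpt-4o-mini",
        "messages": [
            {"role": "user", "content": "Write a haiku about TensorZero."}
        ],
    },
)
response.raise_for_status()
body = response.json()

print(body["choices"][0]["message"]["content"])

# Assumption: the gateway includes the episode ID in the response body.
# Keep it for the feedback request below.
print("episode_id:", body["episode_id"])
```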
Then, let's make a feedback request to the gateway.
> [!IMPORTANT]
> Make sure to replace the `episode_id` with the actual episode ID from the inference request above (not the inference ID!).
curl -X POST "http://localhost:3000/feedback" \
-H "Content-Type: application/json" \
-d '{
"metric_name": "comment",
"episode_id": "00000000-0000-0000-0000-000000000000",
"value": "Great haiku!"
}'
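The equivalent feedback request in Python is sketched below; substitute the episode ID you captured from the inference response for the placeholder.

```python
import requests

# Use the episode ID returned by the inference request above
# (not the inference ID). A placeholder is shown here.
episode_id = "00000000-0000-0000-0000-000000000000"

# Attach a comment to the episode via the gateway's feedback endpoint.
response = requests.post(
    "http://localhost:3000/feedback",
    json={
        "metric_name": "comment",
        "episode_id": episode_id,
        "value": "Great haiku!",
    },
)
response.raise_for_status()
print(response.json())
```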
Finally, visit the Jaeger UI at http://localhost:16686 to see the traces.