documentation/docs/tutorials/langfuse.md
This tutorial covers how to integrate goose with Langfuse so you can monitor your goose requests and understand how the agent is performing.
Langfuse is an open-source LLM engineering platform that enables teams to collaboratively monitor, evaluate, and debug their LLM applications.
Sign up for Langfuse Cloud or self-host Langfuse (for example, with Docker Compose) to get your Langfuse API keys.
Set the following environment variables so that goose (written in Rust) can connect to the Langfuse server:
```bash
export LANGFUSE_INIT_PROJECT_PUBLIC_KEY=pk-lf-...
export LANGFUSE_INIT_PROJECT_SECRET_KEY=sk-lf-...
export LANGFUSE_URL=https://cloud.langfuse.com # EU data region 🇪🇺
# https://us.cloud.langfuse.com if you're using the US region 🇺🇸
# http://localhost:3000 if you're self-hosting
```
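Before starting goose, it can help to confirm that all three variables are actually present in the shell. The snippet below is a minimal, hypothetical sanity check (the function name `check_langfuse_env` is ours, not part of goose or Langfuse); it only inspects the variables exported above:

```bash
# Verify the Langfuse variables are set before launching goose.
# check_langfuse_env is an illustrative helper, not a goose/Langfuse command.
check_langfuse_env() {
  missing=0
  for var in LANGFUSE_INIT_PROJECT_PUBLIC_KEY LANGFUSE_INIT_PROJECT_SECRET_KEY LANGFUSE_URL; do
    # Look up the value of the variable whose name is in $var.
    eval "val=\${$var}"
    if [ -z "$val" ]; then
      echo "missing: $var" >&2
      missing=1
    fi
  done
  return "$missing"
}
```

Run `check_langfuse_env` in the same shell where you will launch goose; a non-zero exit status means at least one variable is missing.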
Now you can run goose and monitor your AI requests and actions through Langfuse. With goose running and the environment variables set, Langfuse will start capturing traces of your goose activity.
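If traces are not showing up, a quick way to rule out connectivity problems is to probe the configured Langfuse host. The sketch below builds a health-check URL from `LANGFUSE_URL`; the `/api/public/health` path is our assumption about Langfuse's public API, so verify it against your deployment, and the helper name `langfuse_health_url` is illustrative:

```bash
# Compose a health-check URL from the LANGFUSE_URL exported earlier.
# The /api/public/health path is assumed; confirm it for your Langfuse version.
langfuse_health_url() {
  # Strip any trailing slash, then append the health path.
  printf '%s/api/public/health\n' "${LANGFUSE_URL%/}"
}
```

Then check reachability with, for example, `curl -sf "$(langfuse_health_url)"`; a failing request points at the URL or network rather than goose itself.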