Observability with OpenLLMetry


OpenLLMetry is an open-source project based on OpenTelemetry for tracing and monitoring LLM applications. It connects to all major observability platforms (like Datadog, Dynatrace, Honeycomb, New Relic and others) and installs in minutes.

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and OpenLLMetry.

```python
!pip install llama-index
!pip install traceloop-sdk
```

Configure API keys

Sign up for Traceloop at app.traceloop.com. Then go to the API keys page and create a new API key. Copy the key and paste it into the cell below.

If you prefer to use a different observability platform like Datadog, Dynatrace, Honeycomb or others, you can find instructions on how to configure it here.

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["TRACELOOP_API_KEY"] = "..."
```
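
Hardcoding keys is fine for a scratch notebook, but a small standard-library guard can fail fast before any tracing starts. This is just a sketch; `require_env` is a hypothetical helper, not part of either SDK:

```python
import os


def require_env(*names: str) -> None:
    """Raise early if any required environment variable is unset or empty."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing environment variables: {', '.join(missing)}"
        )


# Usage: require_env("OPENAI_API_KEY", "TRACELOOP_API_KEY")
```

Calling it right before `Traceloop.init()` turns a confusing mid-run failure into an immediate, readable error.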

Initialize OpenLLMetry

```python
from traceloop.sdk import Traceloop

Traceloop.init()
```

Download Data

```python
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
```
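
If `wget` isn't available (e.g., on Windows), the same download can be done with the standard library. This is a sketch equivalent of the mkdir/wget cell above; `download` is a hypothetical helper name:

```python
import os
import urllib.request


def download(url: str, dest: str) -> str:
    """Fetch url to dest, creating any missing parent directories first."""
    os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
    urllib.request.urlretrieve(url, dest)
    return dest


# Usage:
# download(
#     "https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt",
#     "data/paul_graham/paul_graham_essay.txt",
# )
```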
```python
from llama_index.core import SimpleDirectoryReader

docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
```

Run a query

```python
from llama_index.core import VectorStoreIndex

index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
```

Go to Traceloop (or your observability platform of choice) to view the results.