
HoneyHive LlamaIndex Tracer


HoneyHive is a platform that helps developers monitor, evaluate and continuously improve their LLM-powered applications.

The HoneyHiveLlamaIndexTracer integrates with HoneyHive to help developers debug and analyze the execution flow of their LLM pipelines, and to let them attach feedback to specific trace events in order to build evaluation or fine-tuning datasets from production data.
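As a rough mental model, a callback-based tracer like this receives start/end notifications for each pipeline event (retrieval, LLM call, synthesis, and so on) and records them as timed spans that the platform stitches into a session trace. The sketch below is purely illustrative and is not HoneyHive's actual implementation:

```python
import time
import uuid


# Illustrative sketch only -- not HoneyHive's actual implementation.
# A callback-based tracer records each pipeline event as a timed span.
class MiniTracer:
    def __init__(self):
        self.events = []

    def on_event_start(self, event_type, payload=None):
        event = {
            "id": str(uuid.uuid4()),
            "type": event_type,
            "payload": payload,
            "start": time.time(),
        }
        self.events.append(event)
        return event["id"]

    def on_event_end(self, event_id):
        # Close the matching span by recording its end timestamp.
        for event in self.events:
            if event["id"] == event_id:
                event["end"] = time.time()


tracer = MiniTracer()
eid = tracer.on_event_start("query", {"query_str": "What did the author do?"})
tracer.on_event_end(eid)
print(len(tracer.events))
```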

```python
%pip install llama-index-llms-openai
```
```python
import os
from getpass import getpass

if os.getenv("OPENAI_API_KEY") is None:
    os.environ["OPENAI_API_KEY"] = getpass(
        "Paste your OpenAI key from:"
        " https://platform.openai.com/account/api-keys\n"
    )
assert os.getenv("OPENAI_API_KEY", "").startswith(
    "sk-"
), "This doesn't look like a valid OpenAI API key"
print("OpenAI API key configured")
```
```python
import os
from getpass import getpass

if os.getenv("HONEYHIVE_API_KEY") is None:
    os.environ["HONEYHIVE_API_KEY"] = getpass(
        "Paste your HoneyHive key from:"
        " https://app.honeyhive.ai/settings/account\n"
    )
print("HoneyHive API key configured")
```

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
!pip install llama-index
```
```python
from llama_index.core.callbacks import CallbackManager
from llama_index.core.callbacks import LlamaDebugHandler
from llama_index.core import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    SimpleKeywordTableIndex,
    StorageContext,
)
from llama_index.core import ComposableGraph
from llama_index.llms.openai import OpenAI
from honeyhive.utils.llamaindex_tracer import HoneyHiveLlamaIndexTracer
```

Setup LLM

```python
from llama_index.core import Settings

Settings.llm = OpenAI(model="gpt-4", temperature=0)
```

HoneyHive Callback Manager Setup

Option 1: Set Global Evaluation Handler

```python
import llama_index.core
from llama_index.core import set_global_handler

set_global_handler(
    "honeyhive",
    project="My LlamaIndex Project",
    name="My LlamaIndex Pipeline",
    api_key=os.environ["HONEYHIVE_API_KEY"],
)
hh_tracer = llama_index.core.global_handler
```

Option 2: Manually Configure Callback Handler

We also configure a LlamaDebugHandler for extra visibility in the notebook.

```python
llama_debug = LlamaDebugHandler(print_trace_on_end=True)

hh_tracer = HoneyHiveLlamaIndexTracer(
    project="My LlamaIndex Project",
    name="My LlamaIndex Pipeline",
    api_key=os.environ["HONEYHIVE_API_KEY"],
)

callback_manager = CallbackManager([llama_debug, hh_tracer])

Settings.callback_manager = callback_manager
```
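Conceptually, the callback manager fans each event out to every registered handler, which is why the debug handler and the HoneyHive tracer can observe the same run side by side. Here is a minimal sketch of that dispatch pattern (illustrative only, not LlamaIndex's actual code):

```python
# Illustrative sketch of callback fan-out -- not LlamaIndex's actual code.
class FanOutManager:
    def __init__(self, handlers):
        self.handlers = handlers

    def emit(self, event_type, payload=None):
        # Every registered handler sees every event.
        for handler in self.handlers:
            handler(event_type, payload)


seen_by_debug, seen_by_tracer = [], []
manager = FanOutManager(
    [
        lambda event_type, payload: seen_by_debug.append(event_type),
        lambda event_type, payload: seen_by_tracer.append(event_type),
    ]
)
manager.emit("llm_call", {"prompt": "hello"})
print(seen_by_debug, seen_by_tracer)
```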

1. Indexing

Download Data

```python
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
```
```python
docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
```
```python
index = VectorStoreIndex.from_documents(docs)
```

2. Query Over Index

```python
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
```

View HoneyHive Traces

When we are done tracing our events, we can view them on the HoneyHive platform. Simply log in to HoneyHive, open the My LlamaIndex Project project, click the Data Store tab, and view your Sessions.