
PromptLayer Handler

docs/examples/observability/PromptLayerHandler.ipynb


PromptLayer is an LLMOps tool that helps you manage, track, and organize your prompts. Currently, this integration supports only OpenAI.

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and PromptLayer.

```python
!pip install llama-index
!pip install promptlayer
```

Configure API keys

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["PROMPTLAYER_API_KEY"] = "pl_..."
```
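Hardcoding placeholder keys in a notebook makes it easy to forget one. As a sketch, a small hypothetical helper (`check_keys` is not part of LlamaIndex or PromptLayer) can confirm both keys are set before running the rest of the notebook:

```python
import os

# Hypothetical helper (not part of LlamaIndex or PromptLayer): return the
# names of any required environment variables that are missing or empty.
def check_keys(names):
    return [n for n in names if not os.environ.get(n)]

missing = check_keys(["OPENAI_API_KEY", "PROMPTLAYER_API_KEY"])
if missing:
    print(f"Set these before continuing: {missing}")
```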

Download Data

```python
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
```
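If the download fails silently (for example, on a machine without network access), the index build later in the notebook will raise a confusing error. A quick, hypothetical sanity check (`downloaded_ok` is not from the original notebook) can catch that early:

```python
from pathlib import Path

# Hypothetical check (not part of the original notebook): confirm the essay
# file exists and is non-trivially sized before building an index over it.
def downloaded_ok(path, min_bytes=1_000):
    p = Path(path)
    return p.exists() and p.stat().st_size >= min_bytes

if not downloaded_ok("data/paul_graham/paul_graham_essay.txt"):
    print("Essay missing or truncated; re-run the download cell above.")
```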
```python
from llama_index.core import SimpleDirectoryReader

docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
```

Callback Manager Setup

```python
from llama_index.core import set_global_handler

# pl_tags are optional; they help you organize your prompts and apps
set_global_handler("promptlayer", pl_tags=["paul graham", "essay"])
```

Trigger the callback with a query

```python
from llama_index.core import VectorStoreIndex

index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
```
```python
response = query_engine.query("What did the author do growing up?")
print(response)
```

Visit promptlayer.com to see your logged requests and stats.