docs/examples/observability/PromptLayerHandler.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/observability/PromptLayerHandler.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
PromptLayer is an LLMOps tool that helps you manage, version, and track your prompts; check out its features. Currently, this integration supports only OpenAI models.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and PromptLayer.
!pip install llama-index
!pip install promptlayer
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["PROMPTLAYER_API_KEY"] = "pl_..."
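Both keys must be set before any queries run, or the request will fail with an authentication error partway through the notebook. A minimal sketch of a fail-fast check (the `require_env` helper is illustrative, not part of LlamaIndex or PromptLayer):

```python
import os


# Illustrative helper (not part of either SDK): fail fast with a
# clear message if a required API key is missing from the environment.
def require_env(name: str) -> str:
    value = os.environ.get(name, "")
    if not value:
        raise RuntimeError(f"Set {name} before running the rest of the notebook")
    return value


# Placeholders so the cell runs standalone; replace with your real keys.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")
os.environ.setdefault("PROMPTLAYER_API_KEY", "pl_...")

require_env("OPENAI_API_KEY")
require_env("PROMPTLAYER_API_KEY")
```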
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
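Before building the index, it can help to confirm the download actually succeeded; a small optional sanity check (the path mirrors the `wget` target above):

```python
from pathlib import Path

# Optional sanity check: list the downloaded essay file(s) before indexing.
# The directory matches the `wget` target in the cell above.
data_dir = Path("data/paul_graham")
txt_files = sorted(p.name for p in data_dir.glob("*.txt"))
print(txt_files or "no files yet -- re-run the download cell")
```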
from llama_index.core import SimpleDirectoryReader
docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
from llama_index.core import set_global_handler
# pl_tags are optional, to help you organize your prompts and apps
set_global_handler("promptlayer", pl_tags=["paul graham", "essay"])
from llama_index.core import VectorStoreIndex
index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")