cookbook/LiteLLM_PromptLayer.ipynb
PromptLayer lets you track your LLM requests, responses, and prompts.
LiteLLM lets you call any LiteLLM-supported model and log the request data to PromptLayer.
Getting started docs: https://docs.litellm.ai/docs/observability/promptlayer_integration
!pip install litellm
import litellm
from litellm import completion
import os
os.environ['OPENAI_API_KEY'] = ""        # required for the OpenAI example below
os.environ['REPLICATE_API_TOKEN'] = ""   # required for the Replicate example below
os.environ['PROMPTLAYER_API_KEY'] = ""   # your PromptLayer API key
# Set PromptLayer as a success callback
litellm.success_callback = ["promptlayer"]
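Conceptually, `success_callback` is just a list of handlers that get invoked after each successful completion call. A minimal pure-Python sketch of that pattern (the names `success_callbacks` and `fire_success` are illustrative, not LiteLLM internals):

```python
# A list of handlers, mirroring the shape of litellm.success_callback.
success_callbacks = []

def fire_success(payload: dict) -> None:
    """Invoke every registered handler with the call's payload."""
    for cb in success_callbacks:
        cb(payload)

# Register a handler that records which model was used.
logged = []
success_callbacks.append(lambda p: logged.append(p["model"]))

# Simulate a successful completion firing the callbacks.
fire_success({"model": "gpt-3.5-turbo"})
print(logged)
```

With the real `"promptlayer"` string in `litellm.success_callback`, LiteLLM's built-in PromptLayer handler plays the role of the lambda above, forwarding each request/response to PromptLayer.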
result = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "gm this is ishaan"}])
print(result)
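The result follows the OpenAI-style chat completion shape, so the generated text sits under `choices[0].message.content`. A minimal sketch of pulling it out, using an illustrative hand-written response dict (the content string is made up, not real model output):

```python
# A representative OpenAI-style response; values are illustrative.
response = {
    "model": "gpt-3.5-turbo",
    "choices": [
        {"message": {"role": "assistant", "content": "Good morning, Ishaan!"}}
    ],
}

def first_message(resp: dict) -> str:
    """Return the assistant text from the first choice."""
    return resp["choices"][0]["message"]["content"]

print(first_message(response))
```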
model="replicate/codellama-13b:1c914d844307b0588599b8393480a3ba917b660c7e9dfae681542b5325f228db"
result = completion(model=model, messages=[{"role": "user", "content": "gm this is ishaan"}])
print(result)