# Using LiteLLM with PromptLayer

PromptLayer lets you track your LLM requests, responses, and prompts in one dashboard.

LiteLLM lets you call any LiteLLM-supported model through a single `completion()` interface and send the request/response data to PromptLayer.

Getting started docs: https://docs.litellm.ai/docs/observability/promptlayer_integration

```python
!pip install litellm
```
```python
import os

import litellm
from litellm import completion

# API keys for the providers you plan to call, plus PromptLayer
os.environ['OPENAI_API_KEY'] = ""
os.environ['REPLICATE_API_TOKEN'] = ""
os.environ['PROMPTLAYER_API_KEY'] = "test-promptlayer-key-123"  # replace with your PromptLayer API key

# Set PromptLayer as a success callback: every successful
# completion() call will be logged to PromptLayer
litellm.success_callback = ["promptlayer"]
```
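
If you want to organize these requests in the PromptLayer dashboard, the getting-started docs linked above also cover tagging. A minimal sketch, assuming your litellm version forwards a `pl_tags` metadata key to PromptLayer (the key name is an assumption here; verify against the integration docs):

```python
# Assumed tagging pattern: pl_tags in metadata is forwarded to
# PromptLayer by the integration (check the docs for your version)
result = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "gm this is ishaan"}],
    metadata={"pl_tags": ["litellm-demo"]},
)
```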


## Call OpenAI with LiteLLM x PromptLayer

```python
result = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "gm this is ishaan"}])
print(result)
```
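
LiteLLM normalizes provider responses to the OpenAI format, so you can read the generated text the same way regardless of provider. A minimal sketch, assuming a litellm version with attribute-style access on the response object (older versions use dict-style access):

```python
# The response follows the OpenAI schema: choices -> message -> content
print(result.choices[0].message.content)

# Token counts, when the provider reports them
print(result.usage)
```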

## Call Replicate-CodeLlama with LiteLLM x PromptLayer

```python
model = "replicate/codellama-13b:1c914d844307b0588599b8393480a3ba917b660c7e9dfae681542b5325f228db"

result = completion(model=model, messages=[{"role": "user", "content": "gm this is ishaan"}])
print(result)
```
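
Streaming works through the same `completion()` call. A minimal sketch, assuming your litellm version assembles and logs streamed responses to PromptLayer once the stream finishes (worth verifying in your dashboard):

```python
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "gm this is ishaan"}],
    stream=True,
)

# Chunks arrive as OpenAI-style deltas; the final chunk's delta
# may carry no content, so guard against None
for chunk in response:
    content = chunk.choices[0].delta.content
    if content:
        print(content, end="")
print()
```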

## View Logs on PromptLayer
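
After the calls above succeed, each request and its response should appear as a log entry in your PromptLayer dashboard at https://promptlayer.com, where you can open an entry to inspect the prompt and the model output.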