
cookbook/logging_observability/LiteLLM_Arize.ipynb


Use LiteLLM with Arize

https://docs.litellm.ai/docs/observability/arize_integration

This method registers Arize as a LiteLLM callback, so LiteLLM sends trace data to Arize directly. The callback is set in the LiteLLM configuration below, instead of using OpenInference tracing.

Install Dependencies

python
!pip install litellm

Set Env Variables

python
import litellm
import os
from getpass import getpass

os.environ["ARIZE_SPACE_KEY"] = getpass("Enter your Arize space key: ")
os.environ["ARIZE_API_KEY"] = getpass("Enter your Arize API key: ")
os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")
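Missing or empty keys are a common reason traces never show up in Arize, so it can help to verify the variables before making any calls. A minimal stdlib-only check (the `missing_env_vars` helper is illustrative, not part of LiteLLM):

```python
import os

def missing_env_vars(names):
    """Return the names of environment variables that are unset or empty."""
    return [n for n in names if not os.environ.get(n)]

missing = missing_env_vars(["ARIZE_SPACE_KEY", "ARIZE_API_KEY", "OPENAI_API_KEY"])
if missing:
    print("Missing environment variables:", ", ".join(missing))
```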

Let's run a completion call and see the traces in Arize

python
# set arize as a callback, litellm will send the data to arize
litellm.callbacks = ["arize"]
 
# openai call
response = litellm.completion(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "user", "content": "Hi 👋 - i'm openai"}
  ]
)
print(response.choices[0].message.content)