Use LiteLLM with Lunary

https://docs.litellm.ai/docs/observability/lunary_integration

Install Dependencies

```python
%pip install litellm lunary
```

Set Env Variables

```python
import litellm
from litellm import completion
import os

# Get your public key from https://app.lunary.ai/
os.environ["LUNARY_PUBLIC_KEY"] = ""

# LLM provider keys
# You can use any of the litellm supported providers: https://docs.litellm.ai/docs/providers
os.environ["OPENAI_API_KEY"] = ""
```

Set Lunary as a callback for sending data

OpenAI completion call

```python
# set lunary as a callback; litellm will send the data to lunary
litellm.success_callback = ["lunary"]

# openai call
response = completion(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "user", "content": "Hi 👋 - i'm openai"}
  ]
)

print(response)
```
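
You can log failed calls to the same destination. A minimal sketch, assuming `litellm.failure_callback` accepts the same callback names as `success_callback`:

```python
import litellm

# Also send errored calls to Lunary (assumption: "lunary" is a supported
# entry in failure_callback, mirroring success_callback above)
litellm.failure_callback = ["lunary"]
```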

Using LiteLLM with Lunary Templates

You can use LiteLLM seamlessly with Lunary templates to manage your prompts and completions.

Assuming you have created a template "test-template" with a variable "question", you can use it like this:

```python
import lunary
from litellm import completion

template = lunary.render_template("test-template", {"question": "Hello!"})

response = completion(**template)

print(response)
```
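
Assuming `render_template` returns a plain dict of completion kwargs (model, messages, and any saved parameters), you can adjust individual fields before making the call. A minimal sketch; the `temperature` override here is illustrative:

```python
import lunary
from litellm import completion

template = lunary.render_template("test-template", {"question": "Hello!"})

# Override one rendered kwarg before the call (illustrative; any
# standard litellm completion kwarg can be adjusted the same way)
template["temperature"] = 0.2

response = completion(**template)
print(response)
```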