LiteLLM Clarifai

This notebook walks you through using LiteLLM's Clarifai integration to call LLMs hosted on Clarifai and receive responses in the OpenAI output format.

Prerequisites

python
# Install the required packages
!pip install litellm
!pip install clarifai

To obtain a Clarifai Personal Access Token (PAT), follow the steps in the Clarifai documentation.

python
## Set Clarifai credentials
import os

os.environ["CLARIFAI_API_KEY"] = "YOUR_CLARIFAI_PAT"  # Clarifai PAT
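Alternatively, instead of setting the environment variable, you can pass the PAT per call via the api_key argument of completion() (as the streaming example later in this notebook does). A minimal sketch, with "YOUR_CLARIFAI_PAT" as a placeholder:

python
# A minimal sketch: pass the PAT directly instead of via the environment.
# "YOUR_CLARIFAI_PAT" is a placeholder, not a real token.
from litellm import completion

response = completion(
    model="clarifai/mistralai.completion.mistral-large",
    messages=[{"role": "user", "content": "Hello!"}],
    api_key="YOUR_CLARIFAI_PAT",
)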

Mistral-large

python
import litellm

# Disable LiteLLM's verbose debug logging
litellm.set_verbose = False
python
from litellm import completion

messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/mistralai.completion.mistral-large",
    messages=messages,
)

print(f"Mistral large response : {response}")
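Because LiteLLM normalizes the response to the OpenAI format, the generated text can be pulled out of the usual choices/message fields. A quick sketch:

python
# The response follows the OpenAI chat-completion shape, so the
# generated text lives at choices[0].message.content.
poem = response.choices[0].message.content
print(poem)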

Claude-2.1

python
from litellm import completion

messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/anthropic.completion.claude-2_1",
    messages=messages,
)

print(f"Claude-2.1 response : {response}")
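completion() also forwards standard OpenAI-style generation parameters such as max_tokens and temperature. A minimal sketch; the values here are illustrative, not recommendations:

python
from litellm import completion

# Same call with OpenAI-style generation parameters added.
response = completion(
    model="clarifai/anthropic.completion.claude-2_1",
    messages=[{"role": "user", "content": "Write a poem about history?"}],
    max_tokens=256,   # illustrative value
    temperature=0.7,  # illustrative value
)
print(response.choices[0].message.content)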

OpenAI GPT-4 (Streaming)

Although Clarifai does not support streaming natively, you can still pass stream=True and LiteLLM will return the response in its standard streaming (chunked) format.

python
from litellm import completion

messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/openai.chat-completion.GPT-4",
    messages=messages,
    stream=True,
    api_key="YOUR_CLARIFAI_PAT",  # Clarifai PAT
)

for chunk in response:
    print(chunk)
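Each chunk follows the OpenAI streaming format, with incremental text in choices[0].delta.content. A small sketch that reassembles the full completion (it re-requests the stream, since a stream can only be consumed once):

python
from litellm import completion

# Re-request the stream and reassemble the chunks into the full text.
# Each chunk carries an incremental piece in choices[0].delta.content,
# which may be None for the final (finish) chunk.
response = completion(
    model="clarifai/openai.chat-completion.GPT-4",
    messages=[{"role": "user", "content": "Write a poem about history?"}],
    stream=True,
    api_key="YOUR_CLARIFAI_PAT",
)

full_text = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta is not None:
        full_text += delta

print(full_text)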