cookbook/liteLLM_clarifai_Demo.ipynb
This notebook walks you through using liteLLM's Clarifai integration to call LLMs hosted on Clarifai and receive responses in the OpenAI output format.
# Install the required packages
!pip install litellm
!pip install clarifai
To obtain a Clarifai Personal Access Token (PAT), follow the steps mentioned in the link.
## Set Clarifai Credentials
import os
os.environ["CLARIFAI_API_KEY"]= "YOUR_CLARIFAI_PAT" # Clarifai PAT
import litellm

litellm.set_verbose = False
from litellm import completion
messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/mistralai.completion.mistral-large",
    messages=messages,
)
print(f"Mistral Large response: {response}")
from litellm import completion
messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/anthropic.completion.claude-2_1",
    messages=messages,
)
print(f"Claude-2.1 response: {response}")
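Because liteLLM normalizes every provider into the OpenAI response shape, the assistant's text lives at `choices[0].message.content` regardless of which Clarifai model answered. A minimal sketch of pulling that field out — using a hand-built dict as a stand-in for a real response, since a live call requires a valid Clarifai PAT:

```python
# Stand-in for an OpenAI-format completion response; a real call needs a Clarifai PAT.
mock_response = {
    "model": "clarifai/anthropic.completion.claude-2_1",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "History hums in every stone."},
            "finish_reason": "stop",
        }
    ],
}

def extract_text(response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-format completion response."""
    return response["choices"][0]["message"]["content"]

print(extract_text(mock_response))  # History hums in every stone.
```

With an actual liteLLM `ModelResponse` object you can use the same path via attribute access: `response.choices[0].message.content`.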
Although Clarifai does not support native streaming, you can still pass `stream=True` and receive the response in liteLLM's standard streaming chunk format.
from litellm import completion
messages = [{"role": "user", "content": "Write a poem about history?"}]
response = completion(
    model="clarifai/openai.chat-completion.GPT-4",
    messages=messages,
    stream=True,
    api_key="YOUR_CLARIFAI_PAT",  # or rely on the CLARIFAI_API_KEY env var set above
)

for chunk in response:
    print(chunk)
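Each streamed chunk follows the OpenAI delta format, so the full reply can be rebuilt by concatenating `choices[0].delta.content` across chunks. A sketch using plain dicts to stand in for the chunk objects (a live call needs a valid PAT; with real liteLLM chunks you would use attribute access instead of dict keys):

```python
# Hypothetical chunks mimicking the OpenAI streaming delta format.
mock_chunks = [
    {"choices": [{"delta": {"content": "Time "}}]},
    {"choices": [{"delta": {"content": "remembers "}}]},
    {"choices": [{"delta": {"content": "everything."}}]},
    {"choices": [{"delta": {}}]},  # the final chunk may carry no content
]

# Concatenate the deltas, skipping chunks with no content field.
full_text = "".join(
    chunk["choices"][0]["delta"].get("content") or ""
    for chunk in mock_chunks
)
print(full_text)  # Time remembers everything.
```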