cookbook/liteLLM_Getting_Started.ipynb
https://github.com/BerriAI/litellm
LiteLLM is a package that simplifies calling the OpenAI, Azure, Llama2, Cohere, Anthropic, and Hugging Face API endpoints. LiteLLM manages translating inputs to each provider's completion endpoint and returning a consistent, OpenAI-style output.
!pip install litellm
from litellm import completion
import os
Set keys for the models you want to use below
# Only set keys for the LLMs you want to use
os.environ['OPENAI_API_KEY'] = "" #@param
os.environ["ANTHROPIC_API_KEY"] = "" #@param
os.environ["REPLICATE_API_KEY"] = "" #@param
os.environ["COHERE_API_KEY"] = "" #@param
os.environ["AZURE_API_BASE"] = "" #@param
os.environ["AZURE_API_VERSION"] = "" #@param
os.environ["AZURE_API_KEY"] = "" #@param
completion(model="gpt-3.5-turbo", messages=[{ "content": "what's the weather in SF","role": "user"}])
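Since LiteLLM normalizes every provider to the OpenAI response shape, the generated text is available at `response['choices'][0]['message']['content']` regardless of model. A minimal sketch of reading the response (the `make_messages` helper and the environment-variable guard are illustrative, not part of LiteLLM):

```python
import os

def make_messages(prompt):
    """Build the single-turn messages list that completion() expects."""
    return [{"role": "user", "content": prompt}]

messages = make_messages("what's the weather in SF")

# Hypothetical guard: only make the network call if a key is actually set.
if os.environ.get("OPENAI_API_KEY"):
    from litellm import completion
    response = completion(model="gpt-3.5-turbo", messages=messages)
    # OpenAI-style response shape, same for every provider
    print(response["choices"][0]["message"]["content"])
```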
completion(model="claude-2", messages=[{ "content": "what's the weather in SF","role": "user"}])
model = "replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1"
completion(model=model, messages=[{ "content": "what's the weather in SF","role": "user"}])
completion(model="command-nightly", messages=[{ "content": "what's the weather in SF","role": "user"}])
For Azure OpenAI calls, be sure to add the azure/ prefix to the model name. If your deployment ID is chatgpt-test, set model = "azure/chatgpt-test".
completion(model="azure/chatgpt-v-2", messages=[{ "content": "what's the weather in SF","role": "user"}])
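Because every provider above is reached through the same completion() signature, one loop can send the same prompt to whichever models you have keys for. A sketch under that assumption; the model-to-env-var mapping below is illustrative and should match the keys you set earlier:

```python
import os

# Hypothetical pairing of each model in this notebook with the env var it needs.
model_to_key = {
    "gpt-3.5-turbo": "OPENAI_API_KEY",
    "claude-2": "ANTHROPIC_API_KEY",
    "command-nightly": "COHERE_API_KEY",
    "azure/chatgpt-v-2": "AZURE_API_KEY",
}

# Keep only the models whose key is present in the environment.
available = [m for m, key in model_to_key.items() if os.environ.get(key)]

for model in available:
    from litellm import completion
    response = completion(
        model=model,
        messages=[{"role": "user", "content": "what's the weather in SF"}],
    )
    print(model, "->", response["choices"][0]["message"]["content"])
```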