Langchain liteLLM Demo Notebook

cookbook/liteLLM_Langchain_Demo.ipynb

Use ChatLiteLLM() to instantly support 50+ LLM models.

Langchain Docs: https://python.langchain.com/docs/integrations/chat/litellm

Call all LLM models through the same I/O interface.

Example usage:

```python
ChatLiteLLM(model="gpt-3.5-turbo")
ChatLiteLLM(model="claude-2", temperature=0.3)
ChatLiteLLM(model="command-nightly")
ChatLiteLLM(model="replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1")
```
```python
!pip install litellm langchain
```
```python
import os
from langchain.chat_models import ChatLiteLLM
from langchain.schema import HumanMessage
```
```python
# OpenAI: set your API key, then call gpt-3.5-turbo
os.environ['OPENAI_API_KEY'] = ""
chat = ChatLiteLLM(model="gpt-3.5-turbo")
messages = [
    HumanMessage(
        content="what model are you"
    )
]
chat(messages)
```
```python
# Anthropic: same interface, different model name and key
os.environ['ANTHROPIC_API_KEY'] = ""
chat = ChatLiteLLM(model="claude-2", temperature=0.3)
messages = [
    HumanMessage(
        content="what model are you"
    )
]
chat(messages)
```
```python
# Replicate: model names use the "replicate/<model>:<version>" format
os.environ['REPLICATE_API_TOKEN'] = ""
chat = ChatLiteLLM(model="replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1")
messages = [
    HumanMessage(
        content="what model are you?"
    )
]
chat(messages)
```
```python
# Cohere: same call pattern once COHERE_API_KEY is set
os.environ['COHERE_API_KEY'] = ""
chat = ChatLiteLLM(model="command-nightly")
messages = [
    HumanMessage(
        content="what model are you?"
    )
]
chat(messages)
```
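Each provider above reads a different environment variable, and an empty key fails only at request time. A small sketch of a fail-fast check before instantiating ChatLiteLLM — the `required_key` helper and its `PROVIDER_KEYS` table are hypothetical, not part of LiteLLM or LangChain:

```python
import os

# Hypothetical mapping from model-name prefix to the env var its provider expects.
PROVIDER_KEYS = {
    "gpt-3.5-turbo": "OPENAI_API_KEY",
    "claude-2": "ANTHROPIC_API_KEY",
    "command-nightly": "COHERE_API_KEY",
    "replicate/": "REPLICATE_API_TOKEN",
}

def required_key(model: str) -> str:
    """Return the env var name a given model needs."""
    for prefix, var in PROVIDER_KEYS.items():
        if model.startswith(prefix):
            return var
    raise ValueError(f"unknown model: {model}")

def check_key(model: str) -> None:
    """Raise early if the key for this model is missing or empty."""
    var = required_key(model)
    if not os.environ.get(var):
        raise EnvironmentError(f"set {var} before calling ChatLiteLLM(model={model!r})")
```

Calling `check_key("claude-2")` before `ChatLiteLLM(model="claude-2")` surfaces a missing `ANTHROPIC_API_KEY` immediately instead of at the first `chat(messages)` call.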