
LLM Predictor

docs/examples/llm/llm_predictor.ipynb


<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/llm_predictor.ipynb" target="_parent">Open in Colab</a>


If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-openai
%pip install llama-index-llms-langchain
```

```python
!pip install llama-index
```

LangChain LLM

```python
from langchain.chat_models import ChatAnyscale, ChatOpenAI

from llama_index.llms.langchain import LangChainLLM
from llama_index.core import PromptTemplate
```

```python
llm = LangChainLLM(ChatOpenAI())
```

```python
stream = await llm.astream(PromptTemplate("Hi, write a short story"))
```

```python
async for token in stream:
    print(token, end="")
```
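The awaited `astream` call yields tokens asynchronously, which is why the loop above uses `async for`. A minimal stand-in (no API calls; `fake_astream` is illustrative, not part of llama_index) shows how such a stream is drained:

```python
import asyncio


async def fake_astream(text: str):
    # Illustrative stand-in for llm.astream(): yield one "token" at a time.
    for token in text.split():
        yield token + " "


async def main() -> str:
    collected = []
    async for token in fake_astream("Once upon a time"):
        print(token, end="")
        collected.append(token)
    return "".join(collected)


result = asyncio.run(main())
```

Using a plain `for` on an async generator raises a `TypeError`; the `async for` form is what lets the event loop interleave token arrival with printing.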
```python
## Test with ChatAnyscale
llm = LangChainLLM(ChatAnyscale())
```

```python
stream = llm.stream(
    PromptTemplate("Hi, which NFL team has the most Super Bowl wins?")
)
for token in stream:
    print(token, end="")
```

OpenAI LLM

```python
from llama_index.llms.openai import OpenAI
```

```python
llm = OpenAI()
```

```python
stream = await llm.astream("Hi, write a short story")
```

```python
async for token in stream:
    print(token, end="")
```
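Beyond printing, streamed tokens are often accumulated into the full response text. A small sketch with a plain generator (here `fake_stream` is a stand-in for the synchronous `llm.stream()`, not a llama_index API) shows the pattern:

```python
def fake_stream(text: str):
    # Stand-in for llm.stream(): a plain generator of token strings.
    for ch in text:
        yield ch


tokens = []
for token in fake_stream("Hello"):
    print(token, end="")
    tokens.append(token)

# Join the streamed tokens back into the complete response.
full_response = "".join(tokens)
```

The same accumulate-and-join approach works for the async variants above, with `async for` in place of `for`.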