Chat Engine - Best Mode
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/chat_engine/chat_engine_best.ipynb" target="_parent">Open In Colab</a>
The default chat engine mode is "best", which uses the "openai" mode if you are using an OpenAI model that supports the latest function calling API; otherwise it falls back to the "react" mode.
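The selection logic can be sketched in plain Python. This is an illustrative stand-in, not the actual LlamaIndex source; the function name `resolve_chat_mode` and the model set are hypothetical, and the real library inspects the LLM object rather than a model-name string.

```python
# Hypothetical sketch of "best" mode resolution: prefer the "openai"
# agent for function-calling-capable OpenAI models, otherwise fall
# back to the "react" agent. Illustrative subset of model names only.
FUNCTION_CALLING_MODELS = {"gpt-4", "gpt-4-turbo", "gpt-3.5-turbo"}

def resolve_chat_mode(model_name: str) -> str:
    if model_name in FUNCTION_CALLING_MODELS:
        return "openai"
    return "react"

print(resolve_chat_mode("gpt-4"))     # openai
print(resolve_chat_mode("claude-2"))  # react
```

In practice you never call this yourself: passing `chat_mode="best"` to `as_chat_engine` performs the equivalent check for you.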
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-anthropic
%pip install llama-index-llms-openai
!pip install llama-index
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
Load data and build index
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI
# Anthropic is imported so you can swap in a non-OpenAI model
# (e.g. Anthropic(model="claude-2")), which would trigger "react" mode.
from llama_index.llms.anthropic import Anthropic

llm = OpenAI(model="gpt-4")
data = SimpleDirectoryReader(input_dir="./data/paul_graham/").load_data()
index = VectorStoreIndex.from_documents(data)
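Conceptually, `VectorStoreIndex` splits the documents into chunks, embeds each chunk, and later retrieves the chunks most similar to a query. A toy sketch of that retrieval step with hand-rolled vectors (illustrative only — the real index uses an embedding model, not these made-up three-dimensional vectors):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" for three text chunks and a query.
chunks = {
    "chunk about programming": [1.0, 0.1, 0.0],
    "chunk about painting": [0.0, 1.0, 0.2],
    "chunk about food": [0.1, 0.0, 1.0],
}
query = [0.9, 0.2, 0.1]

# Retrieval: pick the chunk whose embedding is closest to the query.
best = max(chunks, key=lambda k: cosine(query, chunks[k]))
print(best)  # chunk about programming
```

The chat engine built below layers conversation memory and an agent loop on top of this retrieval step.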
Configure chat engine
chat_engine = index.as_chat_engine(chat_mode="best", llm=llm, verbose=True)
Chat with your data
response = chat_engine.chat(
"What are the first programs Paul Graham tried writing?"
)
print(response)