
Chat Engine - Best Mode

docs/examples/chat_engine/chat_engine_best.ipynb


<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/chat_engine/chat_engine_best.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>


The default chat engine mode is "best", which uses the "openai" mode if you are using an OpenAI model that supports the latest function calling API; otherwise, it falls back to the "react" mode.
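To make the selection rule concrete, here is a minimal plain-Python sketch of that decision (a hypothetical `pick_chat_mode` helper, not the actual LlamaIndex implementation, which inspects the LLM's metadata for function-calling support rather than matching model names):

```python
def pick_chat_mode(model_name: str) -> str:
    """Simplified sketch of how "best" mode resolves (hypothetical helper)."""
    # OpenAI models with function-calling support get the "openai" agent mode.
    # This name set is illustrative only.
    function_calling_models = {"gpt-4", "gpt-4-turbo", "gpt-3.5-turbo"}
    if model_name in function_calling_models:
        return "openai"
    # Any other model falls back to the ReAct agent loop.
    return "react"

print(pick_chat_mode("gpt-4"))     # openai
print(pick_chat_mode("claude-2"))  # react
```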

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-anthropic
%pip install llama-index-llms-openai
```

```python
!pip install llama-index
```

Download Data

```python
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
```

Get started in 5 lines of code

Load data and build index

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI
from llama_index.llms.anthropic import Anthropic

llm = OpenAI(model="gpt-4")
data = SimpleDirectoryReader(input_dir="./data/paul_graham/").load_data()
index = VectorStoreIndex.from_documents(data)
```

Configure chat engine

```python
chat_engine = index.as_chat_engine(chat_mode="best", llm=llm, verbose=True)
```

Chat with your data

```python
response = chat_engine.chat(
    "What are the first programs Paul Graham tried writing?"
)
```

```python
print(response)
```
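Unlike a plain query engine, a chat engine is stateful: it keeps the conversation history so follow-up questions can refer back to earlier turns. A toy sketch of that bookkeeping (a hypothetical `ChatMemory` class for illustration, not the LlamaIndex memory API):

```python
class ChatMemory:
    """Toy illustration of per-session chat history (hypothetical class)."""

    def __init__(self):
        self.history = []

    def add(self, role: str, content: str) -> None:
        # Each turn is stored so later questions can reference it.
        self.history.append({"role": role, "content": content})

    def as_prompt_context(self) -> str:
        # Prior turns are flattened into context for the next LLM call.
        return "\n".join(f"{m['role']}: {m['content']}" for m in self.history)


memory = ChatMemory()
memory.add("user", "What are the first programs Paul Graham tried writing?")
memory.add("assistant", "(model's answer goes here)")
# A follow-up like "What language did he use?" only makes sense
# because the earlier turns travel along as context.
memory.add("user", "What language did he use?")
print(memory.as_prompt_context())
```

Calling `chat_engine.reset()` on a real chat engine clears this accumulated history.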