
Joint QA Summary Query Engine

docs/examples/query_engine/JointQASummary.ipynb


Open this notebook in Colab: https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/query_engine/JointQASummary.ipynb


If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙. You will also need an OpenAI API key available in the `OPENAI_API_KEY` environment variable, since this example calls OpenAI models.

python
%pip install llama-index-llms-openai
python
!pip install llama-index
python
import nest_asyncio

# Patch the event loop so async LlamaIndex calls work inside the
# notebook's already-running loop.
nest_asyncio.apply()
python
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
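The configuration above sends INFO-level log messages (for example, which engine the router selects for each query) to stdout. As a self-contained sketch of the same pattern, here the stream is an in-memory buffer instead of `sys.stdout` so the effect can be checked directly:

```python
import io
import logging

# Use an in-memory buffer in place of sys.stdout so the log output
# can be inspected programmatically.
buffer = io.StringIO()
logging.basicConfig(stream=buffer, level=logging.INFO, force=True)

logging.getLogger("demo").info("building index")

# basicConfig's default format is "LEVEL:logger:message".
assert "INFO:demo:building index" in buffer.getvalue()
```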

Download Data

python
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'

Load Data

python
from llama_index.core import SimpleDirectoryReader

reader = SimpleDirectoryReader("./data/paul_graham/")
documents = reader.load_data()
python
from llama_index.llms.openai import OpenAI

gpt4 = OpenAI(temperature=0, model="gpt-4")

chatgpt = OpenAI(temperature=0, model="gpt-3.5-turbo")
python
from llama_index.core.composability import QASummaryQueryEngineBuilder

# NOTE: can also specify an existing docstore, summary text, qa_text, etc.
query_engine_builder = QASummaryQueryEngineBuilder(
    llm=gpt4,
)
query_engine = query_engine_builder.build_from_documents(documents)
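Roughly speaking, the builder constructs both a vector (QA) index and a summary index over the same documents, plus a router that sends each query to the more appropriate one. The real router asks an LLM to choose between the two engines; the core idea can be sketched with a simple keyword heuristic (this function is illustrative only, not part of the LlamaIndex API):

```python
def route_query(query: str) -> str:
    """Toy router: send summarization-style questions to the 'summary'
    engine and specific factual questions to the 'qa' engine.
    (The library's router uses an LLM selector; this keyword check
    only illustrates the routing concept.)"""
    summary_cues = ("summary", "summarize", "overview", "overall")
    if any(cue in query.lower() for cue in summary_cues):
        return "summary"
    return "qa"

print(route_query("Can you give me a summary of the author's life?"))  # summary
print(route_query("What did the author do growing up?"))  # qa
```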
python
response = query_engine.query(
    "Can you give me a summary of the author's life?",
)
print(response)
python
response = query_engine.query(
    "What did the author do growing up?",
)
print(response)
python
response = query_engine.query(
    "What did the author do during his time in art school?",
)
print(response)