
Demo: Azure Table Storage as a ChatStore

docs/examples/chat_store/AzureChatStoreDemo.ipynb



This guide shows you how to use our AzureChatStore abstraction, which automatically persists chat histories to Azure Table Storage or Azure Cosmos DB.

<a href="https://colab.research.google.com/drive/1b_0JuVwWSXiLZZjeBAPr-u5_Y9b34Zcp?usp=sharing" target="_parent">Open in Colab</a>

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

python
%pip install llama-index
%pip install llama-index-llms-azure-openai
%pip install llama-index-storage-chat-store-azure
python
import nest_asyncio

nest_asyncio.apply()
python
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
logging.getLogger("azure.core.pipeline.policies.http_logging_policy").setLevel(
    logging.WARNING
)
python
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.core.response.notebook_utils import display_response
from llama_index.core import Settings

Define our models

Staying with the Azure theme, let's define our Azure OpenAI LLM.

python
Settings.llm = AzureOpenAI(
    model="gpt-4",
    deployment_name="gpt-4",
    api_key="",  # your Azure OpenAI API key
    azure_endpoint="",  # your Azure OpenAI endpoint, e.g. https://<resource>.openai.azure.com/
    api_version="2024-03-01-preview",
)

We now define an AzureChatStore, a ChatMemoryBuffer, and a SimpleChatEngine to converse and store chat history in Azure Table Storage.

python
from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.storage.chat_store.azure import AzureChatStore

chat_store = AzureChatStore.from_account_and_key(
    account_name="",  # your Azure Storage account name
    account_key="",  # your Azure Storage account key
    chat_table_name="FranChat",
    metadata_table_name="FranChatMeta",
    metadata_partition_key="conversation1",
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="conversation1",
)

chat_engine = SimpleChatEngine(
    memory=memory, llm=Settings.llm, prefix_messages=[]
)

Test out a ChatEngine with memory backed by Azure Table Storage

python
response = chat_engine.chat("Hello, my name is Fran.")
python
display_response(response)
python
response = chat_engine.chat("What's my name again?")
python
display_response(response)

Start a new conversation

python
chat_store.metadata_partition_key = "conversation2"

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="conversation2",
)

chat_engine = SimpleChatEngine(
    memory=memory, llm=Settings.llm, prefix_messages=[]
)
python
response = chat_engine.chat("What's in a name?")
python
display_response(response)