Kuzu as Graph Memory

examples/graph-db-demo/kuzu-example.ipynb

Prerequisites

Install Mem0 with Graph Memory support

To use Mem0 with Graph Memory support, install it using pip:

bash
pip install "mem0ai[graph]"

This command installs Mem0 along with the necessary dependencies for graph functionality.

Kuzu setup

Kuzu ships embedded in the Python package installed by the command above, so no extra setup is required. To persist the database, just pick an empty directory for Kuzu to write to; the example below uses an in-memory database instead.
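As a sketch, a persistent variant of the graph-store configuration might look like the following. The `db` key mirrors the in-memory config used later in this example; the directory name is an arbitrary choice, and the exact path shape Kuzu expects may vary by version.

```python
import os

# An example on-disk location for Kuzu; any empty or new path works.
db_path = os.path.abspath("kuzu_mem0_db")

config = {
    "graph_store": {
        "provider": "kuzu",
        # Replacing ":memory:" with a path keeps the graph across runs.
        "config": {"db": db_path},
    },
}
print(config["graph_store"]["config"]["db"])
```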

Configuration

Import the required modules and configure OpenAI (set your OpenAI API key):

python
from mem0 import Memory
from openai import OpenAI

import os

os.environ["OPENAI_API_KEY"] = ""
openai_client = OpenAI()

Set up the configuration to use an OpenAI embedder model and Kuzu as the graph store (`":memory:"` keeps the graph in memory only; pass a directory path instead to persist it):

python
config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
    },
    "graph_store": {
        "provider": "kuzu",
        "config": {
            "db": ":memory:",
        },
    },
}
memory = Memory.from_config(config_dict=config)
Define a helper that prints the memories and graph relations returned by `memory.add`:

python
def print_added_memories(results):
    print("::: Saved the following memories:")
    print(" embeddings:")
    for r in results["results"]:
        print("    ", r)
    print(" relations:")
    for k, v in results["relations"].items():
        print("    ", k)
        for e in v:
            print("      ", e)
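To illustrate the shape the helper consumes, here is a minimal mock: a `results` list of memory entries and a `relations` dict of extracted graph edges. The keys and values in the mock are illustrative, not real Mem0 output.

```python
# Same helper as above, repeated here so this cell runs standalone.
def print_added_memories(results):
    print("::: Saved the following memories:")
    print(" embeddings:")
    for r in results["results"]:
        print("    ", r)
    print(" relations:")
    for k, v in results["relations"].items():
        print("    ", k)
        for e in v:
            print("      ", e)

# Mock of the expected structure; field names and values are made up
# for illustration only.
mock = {
    "results": [{"memory": "Loves sci-fi movies", "event": "ADD"}],
    "relations": {
        "added_entities": [
            {"source": "myuser", "relationship": "loves", "target": "sci-fi_movies"}
        ]
    },
}
print_added_memories(mock)
```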

Store memories

Define a conversation to remember:

python
user = "myuser"

messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about thriller movies? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]

Store memories in Kuzu:

python
results = memory.add(messages, user_id=user, metadata={"category": "movie_recommendations"})
print_added_memories(results)

Search memories

Run a semantic search over the stored memories:

python
for result in memory.search("what kind of movies do I like?", user_id=user)["results"]:
    print(result["memory"], result["score"])

Chatbot

Put the pieces together in a small chatbot that retrieves relevant memories, answers using them, and stores new memories from each exchange:

python
def chat_with_memories(message: str, user_id: str = user) -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])
    print("::: Using memories:")
    print(memories_str)

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4.1-nano-2025-04-14", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    results = memory.add(messages, user_id=user_id)
    print_added_memories(results)

    return assistant_response
Run an interactive loop (type `exit` to quit):

python
print("Chat with AI (type 'exit' to quit)")
while True:
    user_input = input(">>> You: ").strip()
    if user_input.lower() == 'exit':
        print("Goodbye!")
        break
    print(f"<<< AI response:\n{chat_with_memories(user_input)}")