OpenMemory

⚠️ Sunsetting Notice: OpenMemory is being sunset. For local self-hosted memory with a dashboard, please use the Mem0 self-hosted server instead. Get started with cd server && make bootstrap. See the self-hosted docs for configuration details.

OpenMemory is your personal memory layer for LLMs - private, portable, and open-source. Your memories live locally, giving you complete control over your data. Build AI applications with personalized memories while keeping your data secure.

Easy Setup

Prerequisites

  • Docker
  • OpenAI API Key

You can get OpenMemory up and running quickly with the following command:

bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash

Set OPENAI_API_KEY as a global environment variable first:

bash
export OPENAI_API_KEY=your_api_key

Alternatively, pass OPENAI_API_KEY inline when invoking the script:

bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash

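Piping a remote script straight into bash is convenient but hard to audit. If you would rather review the script first, the same installer can be downloaded and run in two steps (standard curl flags; nothing assumed beyond the URL above):

bash
# fetch the installer so you can read it before executing
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh -o run.sh
# after reviewing, run it with your key
OPENAI_API_KEY=your_api_key bash run.sh
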
Prerequisites

  • Docker and Docker Compose
  • Python 3.9+ (for backend development)
  • Node.js (for frontend development)
  • OpenAI API Key (required for LLM interactions; run cp api/.env.example api/.env, then set OPENAI_API_KEY to your key)

Quickstart

1. Set Up Environment Variables

Before running the project, you need to configure environment variables for both the API and the UI.

You can do this in one of the following ways:

  • Manually:
    Create a .env file at each of the following paths:

    • /api/.env
    • /ui/.env
  • Using .env.example files:
    Copy and rename the example files:

    bash
    cp api/.env.example api/.env
    cp ui/.env.example ui/.env
    
  • Using Makefile (if supported):
    Run:

    bash
    make env
    
  • Example /api/.env

env
OPENAI_API_KEY=sk-xxx
USER=<user-id> # The user ID to associate the memories with

  • LLM Configuration (optional)

By default, OpenMemory uses OpenAI for both the LLM (gpt-4o-mini) and the embedder (text-embedding-3-small). You can configure a different provider using these environment variables in /api/.env:

| Variable | Description | Default |
|---|---|---|
| LLM_PROVIDER | LLM provider (openai, ollama, anthropic, groq, together, deepseek, etc.) | openai |
| LLM_MODEL | Model name for the LLM provider | gpt-4o-mini (OpenAI) / llama3.1:latest (Ollama) |
| LLM_API_KEY | API key for the LLM provider | OPENAI_API_KEY env var |
| LLM_BASE_URL | Custom base URL for the LLM API | Provider default |
| OLLAMA_BASE_URL | Ollama-specific base URL (takes precedence over LLM_BASE_URL for Ollama) | http://localhost:11434 |
| EMBEDDER_PROVIDER | Embedder provider (defaults to ollama when the LLM is Ollama, otherwise openai) | openai |
| EMBEDDER_MODEL | Model name for the embedder | text-embedding-3-small (OpenAI) / nomic-embed-text (Ollama) |
| EMBEDDER_API_KEY | API key for the embedder provider | OPENAI_API_KEY env var |
| EMBEDDER_BASE_URL | Custom base URL for the embedder API | Provider default |

Example: Using Ollama (fully local)

env
LLM_PROVIDER=ollama
LLM_MODEL=llama3.1:latest
EMBEDDER_PROVIDER=ollama
EMBEDDER_MODEL=nomic-embed-text
OLLAMA_BASE_URL=http://localhost:11434

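For a fully local setup, the Ollama daemon must be running and both models pulled before OpenMemory can use them. A quick sanity check with standard Ollama commands (the model names match the configuration above):

bash
# pull the models referenced in .env
ollama pull llama3.1:latest
ollama pull nomic-embed-text
# confirm they are available locally
ollama list
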
Example: Using Anthropic

env
LLM_PROVIDER=anthropic
LLM_MODEL=claude-sonnet-4-20250514
LLM_API_KEY=sk-ant-xxx

  • Example /ui/.env

env
NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=<user-id> # Must match the USER value set in api/.env

2. Build and Run the Project

You can run the project using the following two commands:

bash
make build # builds the MCP server and UI
make up    # runs the OpenMemory MCP server and UI

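If make is not available, the targets can typically be replicated with Docker Compose directly. This is an assumption about the project's Makefile; check the Makefile for the exact flags it passes:

bash
# assumed equivalents of `make build` and `make up`
docker compose build
docker compose up -d
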
After running these commands, you will have:

  • The OpenMemory MCP server running at http://localhost:8765
  • The OpenMemory UI running at http://localhost:3000

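A quick way to confirm both services came up; note that the /docs path is the conventional FastAPI documentation route and is an assumption here:

bash
# UI should respond on port 3000
curl -I http://localhost:3000
# API should serve its interactive docs on port 8765
curl -I http://localhost:8765/docs
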
UI not working on localhost:3000?

If the UI does not start properly on http://localhost:3000, try running it manually:

bash
cd ui
pnpm install
pnpm dev

MCP Client Setup

Use the following one-step command to configure OpenMemory Local MCP in a client. The general command format is as follows:

bash
npx @openmemory/install local http://localhost:8765/mcp/<client-name>/sse/<user-id> --client <client-name>

Replace <client-name> with your MCP client's name and <user-id> with the USER value specified in your environment variables.

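For example, to register the server with a client named claude for user alice (both values are illustrative; substitute your client's name and the USER value from api/.env):

bash
npx @openmemory/install local http://localhost:8765/mcp/claude/sse/alice --client claude
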
Project Structure

  • api/ - Backend APIs + MCP server
  • ui/ - Frontend Next.js (React) application

Contributing

We are a team of developers passionate about the future of AI and open-source software. With years of experience in both fields, we believe in the power of community-driven development and are excited to build tools that make AI more accessible and personalized.

We welcome all forms of contributions:

  • Bug reports and feature requests
  • Documentation improvements
  • Code contributions
  • Testing and feedback
  • Community support

How to contribute:

  1. Fork the repository
  2. Create your feature branch (git checkout -b openmemory/feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin openmemory/feature/amazing-feature)
  5. Open a Pull Request

Join us in building the future of AI memory management! Your contributions help make OpenMemory better for everyone.