# OpenMemory
> ⚠️ **Sunsetting Notice:** OpenMemory is being sunset. For local, self-hosted memory with a dashboard, please use the Mem0 self-hosted server instead. Get started with `cd server && make bootstrap`, and see the self-hosted docs for configuration details.
OpenMemory is your personal memory layer for LLMs - private, portable, and open-source. Your memories live locally, giving you complete control over your data. Build AI applications with personalized memories while keeping your data secure.
You can quickly run OpenMemory with the following command:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash
```
You should set `OPENAI_API_KEY` as a global environment variable:

```bash
export OPENAI_API_KEY=your_api_key
```

You can also pass `OPENAI_API_KEY` as a parameter to the script:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash
```
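Since the installer needs a valid key either way, it can help to verify that `OPENAI_API_KEY` is set before piping the script. A minimal POSIX-sh sketch (the `check_key` helper and the `sk-placeholder` value are illustrative, not part of OpenMemory):

```shell
# Guard sketch: refuse to run the installer when no key is present.
check_key() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "OPENAI_API_KEY is not set; export it first" >&2
    return 1
  fi
  echo "OPENAI_API_KEY is set"
}

# Example run with a placeholder key (replace with your real key):
OPENAI_API_KEY=sk-placeholder check_key
```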
Before running the project, you need to configure environment variables for both the API and the UI. You can do this in one of the following ways:

**Manually:**

Create a `.env` file in each of the following directories:

- `/api/.env`
- `/ui/.env`

**Using `.env.example` files:**

Copy and rename the example files, then change `OPENAI_API_KEY` to your own key:

```bash
cp api/.env.example api/.env
cp ui/.env.example ui/.env
```

**Using the Makefile (if supported):**

Run:

```bash
make env
```
### `/api/.env`

```env
OPENAI_API_KEY=sk-xxx
USER=<user-id> # The user id you want to associate the memories with
```
By default, OpenMemory uses OpenAI (gpt-4o-mini) for the LLM and embedder. You can configure a different provider using these environment variables in /api/.env:
| Variable | Description | Default |
|---|---|---|
| `LLM_PROVIDER` | LLM provider (`openai`, `ollama`, `anthropic`, `groq`, `together`, `deepseek`, etc.) | `openai` |
| `LLM_MODEL` | Model name for the LLM provider | `gpt-4o-mini` (OpenAI) / `llama3.1:latest` (Ollama) |
| `LLM_API_KEY` | API key for the LLM provider | `OPENAI_API_KEY` env var |
| `LLM_BASE_URL` | Custom base URL for the LLM API | Provider default |
| `OLLAMA_BASE_URL` | Ollama-specific base URL (takes precedence over `LLM_BASE_URL` for Ollama) | `http://localhost:11434` |
| `EMBEDDER_PROVIDER` | Embedder provider (defaults to `ollama` when the LLM is Ollama, otherwise `openai`) | `openai` |
| `EMBEDDER_MODEL` | Model name for the embedder | `text-embedding-3-small` (OpenAI) / `nomic-embed-text` (Ollama) |
| `EMBEDDER_API_KEY` | API key for the embedder provider | `OPENAI_API_KEY` env var |
| `EMBEDDER_BASE_URL` | Custom base URL for the embedder API | Provider default |
**Example: Using Ollama (fully local)**

```env
LLM_PROVIDER=ollama
LLM_MODEL=llama3.1:latest
EMBEDDER_PROVIDER=ollama
EMBEDDER_MODEL=nomic-embed-text
OLLAMA_BASE_URL=http://localhost:11434
```
**Example: Using Anthropic**

```env
LLM_PROVIDER=anthropic
LLM_MODEL=claude-sonnet-4-20250514
LLM_API_KEY=sk-ant-xxx
```
### `/ui/.env`

```env
NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=<user-id> # Same as the USER value in /api/.env
```
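The user id in both files must be identical, or the UI will query memories for a different user than the API stores them under. A throwaway sketch of the check (the temp-dir copies and the `alice` id are placeholders, not real project files):

```shell
# Illustrative check: the same user id must appear in both .env files.
# We create disposable copies in a temp dir just to demonstrate the comparison.
tmp=$(mktemp -d)
printf 'OPENAI_API_KEY=sk-xxx\nUSER=alice\n' > "$tmp/api.env"
printf 'NEXT_PUBLIC_API_URL=http://localhost:8765\nNEXT_PUBLIC_USER_ID=alice\n' > "$tmp/ui.env"

API_USER=$(grep '^USER=' "$tmp/api.env" | cut -d= -f2)
UI_USER=$(grep '^NEXT_PUBLIC_USER_ID=' "$tmp/ui.env" | cut -d= -f2)
[ "$API_USER" = "$UI_USER" ] && echo "User ids match"
```

Against a real checkout, you would point the two `grep` calls at `api/.env` and `ui/.env` instead.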
You can run the project with the following two commands:

```bash
make build # builds the MCP server and UI
make up    # runs the OpenMemory MCP server and UI
```
After running these commands, you will have:

- The OpenMemory MCP server running at `http://localhost:8765`
- The OpenMemory UI running at `http://localhost:3000`

**Troubleshooting:** If the UI does not start properly on `http://localhost:3000`, try running it manually:

```bash
cd ui
pnpm install
pnpm dev
```
Use the following one-step command to configure OpenMemory Local MCP for a client. The general command format is:

```bash
npx @openmemory/install local http://localhost:8765/mcp/<client-name>/sse/<user-id> --client <client-name>
```

Replace `<client-name>` with the desired client name and `<user-id>` with the value specified in your environment variables.
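For instance, with the hypothetical values `claude` as the client name and `alice` as the user id, the substituted command looks like this:

```shell
# Placeholder values for illustration; use your own client name and user id.
CLIENT=claude
USER_ID=alice
URL="http://localhost:8765/mcp/${CLIENT}/sse/${USER_ID}"

# The fully substituted install command:
echo "npx @openmemory/install local $URL --client $CLIENT"
```

Note that `<client-name>` appears twice: once inside the SSE URL and once as the `--client` flag.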
- `api/` - Backend APIs + MCP server
- `ui/` - Frontend React application

We are a team of developers passionate about the future of AI and open-source software. With years of experience in both fields, we believe in the power of community-driven development and are excited to build tools that make AI more accessible and personalized.
We welcome all forms of contributions:
How to contribute:
1. Create your feature branch (`git checkout -b openmemory/feature/amazing-feature`)
2. Commit your changes (`git commit -m 'Add some amazing feature'`)
3. Push to the branch (`git push origin openmemory/feature/amazing-feature`)

Join us in building the future of AI memory management! Your contributions help make OpenMemory better for everyone.