docs/open-source/node-quickstart.mdx
Spin up Mem0 with the Node SDK in just a few steps. You’ll install the package, initialize the client, add a memory, and confirm retrieval with a single search.
```ts
const memory = new Memory();
```
</Step>
<Step title="Add a memory">
```ts
const messages = [
  { role: "user", content: "I'm planning to watch a movie tonight. Any recommendations?" },
  { role: "assistant", content: "How about thriller movies? They can be quite engaging." },
  { role: "user", content: "I'm not a big fan of thriller movies but I love sci-fi movies." },
  { role: "assistant", content: "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future." }
];

await memory.add(messages, { userId: "alice", metadata: { category: "movie_recommendations" } });
```

Output:

```json
{
  "results": [
    {
      "id": "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
      "memory": "User is planning to watch a movie tonight.",
      "score": 0.38920719231944799,
      "metadata": {
        "category": "movie_recommendations"
      },
      "userId": "alice"
    }
  ]
}
```
</Step>
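If you want to post-process search results in your own code, the `results` array has the shape shown in the sample output above. A minimal sketch in plain TypeScript (the `MemoryResult` interface and `topMemories` helper are illustrative, not part of the SDK):

```typescript
// Shape of a single entry in `results`, per the sample output above.
// Illustrative type only; not exported by mem0ai.
interface MemoryResult {
  id: string;
  memory: string;
  score: number;
  metadata?: Record<string, string>;
  userId?: string;
}

// Keep only results above a relevance threshold, best match first.
function topMemories(results: MemoryResult[], minScore = 0.3): string[] {
  return results
    .filter((r) => r.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .map((r) => r.memory);
}

const sample: MemoryResult[] = [
  {
    id: "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
    memory: "User is planning to watch a movie tonight.",
    score: 0.389,
    userId: "alice"
  }
];

console.log(topMemories(sample)); // ["User is planning to watch a movie tonight."]
```

The threshold and ordering here are arbitrary; tune them to your use case.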
You can customize the embedder, vector store, and LLM by passing a configuration object:

```ts
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  embedder: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "text-embedding-3-small"
    }
  },
  vectorStore: {
    provider: "memory",
    config: {
      collectionName: "memories",
      dimension: 1536
    }
  },
  llm: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "gpt-4-turbo-preview"
    }
  },
  historyDbPath: "memory.db"
});
```
```ts
// Retrieve a single memory by ID
const singleMemory = await memory.get("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
console.log(singleMemory);

// Search memories scoped to a user
const result = await memory.search("What do you know about me?", { filters: { userId: "alice" } });
console.log(result);

// Update an existing memory
const updateResult = await memory.update(
  "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
  "I love India, it is my favorite country."
);
console.log(updateResult);

// Audit history
const history = await memory.history("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
console.log(history);

// Delete specific or scoped memories
await memory.delete("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
await memory.deleteAll({ userId: "alice" });

// Reset everything
await memory.reset();
```
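These operations combine naturally in a retrieval loop: search for memories relevant to the user's message, then fold them into the prompt you send to your LLM. A minimal sketch of the prompt-building half (the `buildSystemPrompt` helper is hypothetical, not an SDK function; plug in the memory strings returned by `memory.search`):

```typescript
// Illustrative helper, not part of mem0ai: turn retrieved memory
// strings into a system prompt for whatever LLM client you use.
function buildSystemPrompt(memories: string[]): string {
  if (memories.length === 0) {
    return "You are a helpful assistant.";
  }
  return [
    "You are a helpful assistant. Known facts about the user:",
    ...memories.map((m) => `- ${m}`)
  ].join("\n");
}

const prompt = buildSystemPrompt([
  "Loves sci-fi movies",
  "Dislikes thriller movies"
]);
console.log(prompt);
```

Keeping prompt assembly in a small pure function like this makes it easy to test without hitting the vector store or the LLM.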
The Node SDK supports Supabase (or other providers) when you need serverless-friendly history storage.
<CodeGroup>
```ts Supabase provider
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  historyStore: {
    provider: "supabase",
    config: {
      supabaseUrl: process.env.SUPABASE_URL || "",
      supabaseKey: process.env.SUPABASE_KEY || "",
      tableName: "memory_history"
    }
  }
});
```

```ts Disable history
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  disableHistory: true
});
```
</CodeGroup>
Create the Supabase table with:

```sql
create table memory_history (
  id text primary key,
  memory_id text not null,
  previous_value text,
  new_value text,
  action text not null,
  created_at timestamp with time zone default timezone('utc', now()),
  updated_at timestamp with time zone,
  is_deleted integer default 0
);
```
Mem0 offers granular configuration across vector stores, LLMs, embedders, and history stores.
<AccordionGroup>
<Accordion title="Vector store">
| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | Vector store provider (e.g., `"memory"`) | `"memory"` |
| `host` | Host address | `"localhost"` |
| `port` | Port number | `undefined` |
</Accordion>
<Accordion title="LLM">
| Parameter | Description | Provider |
| --- | --- | --- |
| `provider` | LLM provider (e.g., `"openai"`, `"anthropic"`) | All |
| `model` | Model to use | All |
| `temperature` | Temperature value | All |
| `apiKey` | API key | All |
| `maxTokens` | Max tokens to generate | All |
| `topP` | Probability threshold | All |
| `topK` | Token count to keep | All |
| `openaiBaseUrl` | Base URL override | OpenAI |
</Accordion>
<Accordion title="Embedder">
| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | Embedding provider | `"openai"` |
| `model` | Embedding model | `"text-embedding-3-small"` |
| `apiKey` | API key | `undefined` |
</Accordion>
<Accordion title="General">
| Parameter | Description | Default |
| --- | --- | --- |
| `historyDbPath` | Path to history database | `"{mem0_dir}/history.db"` |
| `customInstructions` | Custom processing prompt | `undefined` |
</Accordion>
<Accordion title="History store">
| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | History provider | `"sqlite"` |
| `config` | Provider configuration | `undefined` |
| `disableHistory` | Disable history store | `false` |
</Accordion>
<Accordion title="Complete config example">
```ts
const config = {
  embedder: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "text-embedding-3-small"
    }
  },
  vectorStore: {
    provider: "memory",
    config: {
      collectionName: "memories",
      dimension: 1536
    }
  },
  llm: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "gpt-4-turbo-preview"
    }
  },
  historyStore: {
    provider: "supabase",
    config: {
      supabaseUrl: process.env.SUPABASE_URL || "",
      supabaseKey: process.env.SUPABASE_KEY || "",
      tableName: "memories"
    }
  },
  disableHistory: false,
  customInstructions: "I'm a virtual assistant. I'm here to help you with your queries."
};
```
</Accordion>
</AccordionGroup>

If you have any questions, please feel free to reach out:
<Snippet file="get-help.mdx" />