Node SDK Quickstart


Spin up Mem0 with the Node SDK in just a few steps. You’ll install the package, initialize the client, add a memory, and confirm retrieval with a single search.

Prerequisites

  • Node.js 18 or higher
  • (Optional) OpenAI API key stored in your environment when you want to customize providers
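If you want to fail fast when the runtime is too old, the Node version requirement can be checked programmatically. A minimal sketch (the `requireNode` helper is illustrative, not part of the SDK):

```typescript
// Illustrative guard: verify the running Node.js version satisfies the
// "Node.js 18 or higher" prerequisite before loading the SDK.
function requireNode(minMajor: number): number {
  // process.version looks like "v20.11.1"; strip the "v", take the major part.
  const major = Number(process.version.slice(1).split(".")[0]);
  if (major < minMajor) {
    throw new Error(`Node.js ${minMajor}+ required, found ${process.version}`);
  }
  return major;
}

const major = requireNode(18);
console.log(`Running Node.js ${major}`);
```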

Install and run your first memory

<Steps>
<Step title="Install the SDK">

```bash
npm install mem0ai
```

</Step>
<Step title="Initialize the client">

```ts
import { Memory } from "mem0ai/oss";

const memory = new Memory();
```

</Step>

<Step title="Add a memory">
```ts
const messages = [
  { role: "user", content: "I'm planning to watch a movie tonight. Any recommendations?" },
  { role: "assistant", content: "How about thriller movies? They can be quite engaging." },
  { role: "user", content: "I'm not a big fan of thriller movies but I love sci-fi movies." },
  { role: "assistant", content: "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future." }
];

await memory.add(messages, { userId: "alice", metadata: { category: "movie_recommendations" } });
```

</Step>
<Step title="Search memories">

```ts
const results = await memory.search("What do you know about me?", { filters: { userId: "alice" } });
console.log(results);
```

Output

```json
{
  "results": [
    {
      "id": "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
      "memory": "User is planning to watch a movie tonight.",
      "score": 0.38920719231944799,
      "metadata": {
        "category": "movie_recommendations"
      },
      "userId": "alice"
    }
  ]
}
```
</Step>
</Steps>

<Note>
By default the Node SDK uses local-friendly settings (OpenAI `gpt-5-mini`, `text-embedding-3-small`, in-memory vector store, and SQLite history). Swap components by passing a config as shown below.
</Note>
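Each search result carries a relevance `score`, so callers often post-filter the response before using it. A minimal sketch against the sample output above (the `topMemories` helper and the `0.3` threshold are illustrative, not part of the SDK):

```typescript
// Illustrative post-processing of a search response like the sample above.
interface MemoryResult {
  id: string;
  memory: string;
  score: number;
  metadata?: Record<string, string>;
  userId?: string;
}

// Keep only results at or above a relevance threshold, best match first.
function topMemories(results: MemoryResult[], minScore: number): string[] {
  return results
    .filter((r) => r.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .map((r) => r.memory);
}

const response = {
  results: [
    {
      id: "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
      memory: "User is planning to watch a movie tonight.",
      score: 0.38920719231944799,
      metadata: { category: "movie_recommendations" },
      userId: "alice",
    },
  ],
};

const memories = topMemories(response.results, 0.3);
console.log(memories); // → ["User is planning to watch a movie tonight."]
```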

Configure for production

```ts
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  embedder: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "text-embedding-3-small"
    }
  },
  vectorStore: {
    provider: "memory",
    config: {
      collectionName: "memories",
      dimension: 1536
    }
  },
  llm: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "gpt-4-turbo-preview"
    }
  },
  historyDbPath: "memory.db"
});
```
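The `|| ""` fallbacks above can hide a missing key until the first provider call fails. A small sketch of failing fast instead (the `requireEnv` helper is illustrative, not part of the SDK; real code would pass `process.env`):

```typescript
// Illustrative guard: raise a clear error when a required environment
// variable is missing, rather than passing "" through to the provider.
function requireEnv(
  name: string,
  env: Record<string, string | undefined>
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Explicit env object here so the behavior is easy to see in isolation.
const fakeEnv = { OPENAI_API_KEY: "sk-test" };
const apiKey = requireEnv("OPENAI_API_KEY", fakeEnv);
console.log(apiKey); // → "sk-test"
```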

Manage memories (optional)

<CodeGroup>

```ts Get all memories
const allMemories = await memory.getAll({ filters: { userId: "alice" } });
console.log(allMemories);
```

```ts Get a single memory
const singleMemory = await memory.get("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
console.log(singleMemory);
```

```ts Search memories
const result = await memory.search("What do you know about me?", { filters: { userId: "alice" } });
console.log(result);
```

```ts Update a memory
const updateResult = await memory.update(
  "892db2ae-06d9-49e5-8b3e-585ef9b85b8e",
  "I love India, it is my favorite country."
);
console.log(updateResult);
```

</CodeGroup>
```ts
// Audit history
const history = await memory.history("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
console.log(history);

// Delete specific or scoped memories
await memory.delete("892db2ae-06d9-49e5-8b3e-585ef9b85b8e");
await memory.deleteAll({ userId: "alice" });

// Reset everything
await memory.reset();
```

Use a custom history store

The Node SDK supports Supabase (or other providers) when you need serverless-friendly history storage.

<CodeGroup>

```ts Supabase provider
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  historyStore: {
    provider: "supabase",
    config: {
      supabaseUrl: process.env.SUPABASE_URL || "",
      supabaseKey: process.env.SUPABASE_KEY || "",
      tableName: "memory_history"
    }
  }
});
```

```ts Disable history
import { Memory } from "mem0ai/oss";

const memory = new Memory({
  disableHistory: true
});
```

</CodeGroup>

Create the Supabase table with:

```sql
create table memory_history (
  id text primary key,
  memory_id text not null,
  previous_value text,
  new_value text,
  action text not null,
  created_at timestamp with time zone default timezone('utc', now()),
  updated_at timestamp with time zone,
  is_deleted integer default 0
);
```

Configuration parameters

Mem0 offers granular configuration across vector stores, LLMs, embedders, and history stores.

<AccordionGroup>
<Accordion title="Vector store">

| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | Vector store provider (e.g., `"memory"`) | `"memory"` |
| `host` | Host address | `"localhost"` |
| `port` | Port number | `undefined` |

</Accordion>
<Accordion title="LLM">

| Parameter | Description | Provider |
| --- | --- | --- |
| `provider` | LLM provider (e.g., `"openai"`, `"anthropic"`) | All |
| `model` | Model to use | All |
| `temperature` | Temperature value | All |
| `apiKey` | API key | All |
| `maxTokens` | Max tokens to generate | All |
| `topP` | Probability threshold | All |
| `topK` | Token count to keep | All |
| `openaiBaseUrl` | Base URL override | OpenAI |

</Accordion>
<Accordion title="Embedder">

| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | Embedding provider | `"openai"` |
| `model` | Embedding model | `"text-embedding-3-small"` |
| `apiKey` | API key | `undefined` |

</Accordion>
<Accordion title="General">

| Parameter | Description | Default |
| --- | --- | --- |
| `historyDbPath` | Path to history database | `"{mem0_dir}/history.db"` |
| `customInstructions` | Custom processing prompt | `undefined` |

</Accordion>
<Accordion title="History store">

| Parameter | Description | Default |
| --- | --- | --- |
| `provider` | History provider | `"sqlite"` |
| `config` | Provider configuration | `undefined` |
| `disableHistory` | Disable history store | `false` |

</Accordion>
<Accordion title="Complete config example">

```ts
const config = {
  embedder: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "text-embedding-3-small"
    }
  },
  vectorStore: {
    provider: "memory",
    config: {
      collectionName: "memories",
      dimension: 1536
    }
  },
  llm: {
    provider: "openai",
    config: {
      apiKey: process.env.OPENAI_API_KEY || "",
      model: "gpt-4-turbo-preview"
    }
  },
  historyStore: {
    provider: "supabase",
    config: {
      supabaseUrl: process.env.SUPABASE_URL || "",
      supabaseKey: process.env.SUPABASE_KEY || "",
      tableName: "memories"
    }
  },
  disableHistory: false,
  customInstructions: "I'm a virtual assistant. I'm here to help you with your queries."
};
```

</Accordion>
</AccordionGroup>

What's next?

<CardGroup cols={3}>
<Card title="Explore Memory Operations" icon="database" href="/core-concepts/memory-operations/add">
Review CRUD patterns, filters, and advanced retrieval across the OSS stack.
</Card>
<Card title="Customize Configuration" icon="sliders" href="/open-source/configuration">
Swap in your preferred LLM, vector store, and history provider for production use.
</Card>
<Card title="Automate Node Workflows" icon="plug" href="/cookbooks/integrations/openai-tool-calls">
See a full Node-based workflow that layers Mem0 memories onto tool-calling agents.
</Card>
</CardGroup>

If you have any questions, please feel free to reach out:

<Snippet file="get-help.mdx" />