
Mem0 Provider

The Mem0 Provider is a library developed by Mem0 to integrate with the AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.

<Note type="info"> 🎉 Exciting news! The Mem0 AI SDK provider now supports <strong>Tool Calls</strong>. </Note>

Setup

The Mem0 provider is available in the @mem0/vercel-ai-provider module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}> <Tab> <Snippet text="pnpm add @mem0/vercel-ai-provider" dark /> </Tab> <Tab> <Snippet text="npm install @mem0/vercel-ai-provider" dark /> </Tab> <Tab> <Snippet text="yarn add @mem0/vercel-ai-provider" dark /> </Tab> <Tab> <Snippet text="bun add @mem0/vercel-ai-provider" dark /> </Tab> </Tabs>

Provider Instance

First, get your Mem0 API Key from the Mem0 Dashboard.

Then initialize the Mem0 Client in your application:

```ts
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0({
  provider: 'openai',
  mem0ApiKey: 'm0-xxx',
  apiKey: 'provider-api-key',
  config: {
    // Configure the LLM provider here
  },
  // Optional Mem0 global config
  mem0Config: {
    user_id: 'mem0-user-id',
    enable_graph: true,
  },
});
```
<Note> The `openai` provider is the default. Consider using `MEM0_API_KEY` and `OPENAI_API_KEY` as environment variables for security. </Note> <Note> The `mem0Config` is optional. It sets the global config for the Mem0 Client (e.g. `user_id`, `agent_id`, `app_id`, `run_id`, `org_id`, `project_id`). </Note>
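Following the note above, a minimal sketch of reading both keys from the environment (via the standard Node `process.env`) instead of hard-coding them:

```ts
import { createMem0 } from '@mem0/vercel-ai-provider';

// Keys come from the environment rather than being committed to source.
// process.env values are string | undefined, so make sure both
// MEM0_API_KEY and OPENAI_API_KEY are actually set before running.
const mem0 = createMem0({
  provider: 'openai',
  mem0ApiKey: process.env.MEM0_API_KEY,
  apiKey: process.env.OPENAI_API_KEY,
});
```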
  • Add Memories to Enhance Context:
```ts
import { LanguageModelV1Prompt } from '@ai-sdk/provider';
import { addMemories } from '@mem0/vercel-ai-provider';

const messages: LanguageModelV1Prompt = [
  { role: 'user', content: [{ type: 'text', text: 'I love red cars.' }] },
];

await addMemories(messages, { user_id: 'borat' });
```

Features

Adding and Retrieving Memories

  • retrieveMemories(): Retrieves memory context for prompts.
  • getMemories(): Gets memories from your profile in array format.
  • addMemories(): Adds user memories to enhance contextual responses.
```ts
await addMemories(messages, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
await retrieveMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
await getMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
```
<Note> For standalone features, such as `addMemories`, `retrieveMemories`, and `getMemories`, you must either set `MEM0_API_KEY` as an environment variable or pass it directly in the function call. </Note> <Note> `getMemories` will return raw memories in the form of an array of objects, while `retrieveMemories` will return a response in string format with a system prompt ingested with the retrieved memories. </Note>
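The distinction in the note above can be made concrete with a toy sketch: `getMemories` hands you raw entries, while `retrieveMemories` folds them into a single system-prompt string. The `MemoryEntry` shape and `memoriesToSystemPrompt` helper below are hypothetical, purely to illustrate the difference; the real API returns richer objects (ids, timestamps, scores) and builds its own prompt text.

```ts
// Hypothetical, simplified shape of one raw memory entry.
interface MemoryEntry {
  memory: string;
}

// Illustrative only: roughly the transformation retrieveMemories()
// performs for you — raw entries in, one system-prompt string out.
function memoriesToSystemPrompt(memories: MemoryEntry[]): string {
  if (memories.length === 0) return '';
  const facts = memories.map(m => `- ${m.memory}`).join('\n');
  return `Relevant memories for this user:\n${facts}`;
}

const raw: MemoryEntry[] = [{ memory: 'Loves red cars' }];
console.log(memoriesToSystemPrompt(raw));
```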

Generate Text with Memory Context

You can use language models from OpenAI, Anthropic, Google, Cohere, and Groq to generate text with the generateText function:

```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  prompt: 'Suggest me a good car to buy!',
});
```

Structured Message Format with Memory

```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Suggest me a good car to buy.' },
        { type: 'text', text: 'Why is it better than the other cars for me?' },
      ],
    },
  ],
});
```

Streaming Responses with Memory Context

```ts
import { streamText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { textStream } = streamText({
  model: mem0('gpt-4.1', {
    user_id: 'borat',
  }),
  prompt:
    'Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.',
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```

Generate Responses with Tool Calls

```ts
import { generateText, tool } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';
import { z } from 'zod';

const mem0 = createMem0({
  provider: 'anthropic',
  apiKey: 'anthropic-api-key',
  mem0Config: {
    // Global user ID
    user_id: 'borat',
  },
});

const prompt = 'What is the temperature in the city that I live in?';

const result = await generateText({
  model: mem0('claude-3-5-sonnet-20240620'),
  tools: {
    weather: tool({
      description: 'Get the weather in a location',
      parameters: z.object({
        location: z.string().describe('The location to get the weather for'),
      }),
      execute: async ({ location }) => ({
        location,
        temperature: 72 + Math.floor(Math.random() * 21) - 10,
      }),
    }),
  },
  prompt: prompt,
});

console.log(result);
```

console.log(result);

Get Sources from Memory

```ts
const { text, sources } = await generateText({
  model: mem0('gpt-4.1'),
  prompt: 'Suggest me a good car to buy!',
});

console.log(sources);
```

This same functionality is available in the streamText function.

Supported LLM Providers

The Mem0 provider supports the following LLM providers:

| Provider  | Configuration Value |
| --------- | ------------------- |
| OpenAI    | `openai`            |
| Anthropic | `anthropic`         |
| Google    | `google`            |
| Groq      | `groq`              |
| Cohere    | `cohere`            |
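The configuration values above can be captured as a union type in your own code, e.g. to validate user-supplied settings before calling `createMem0`. The type and helper below are our own illustrative names, not exports of the package:

```ts
// Supported configuration values, from the table above.
type Mem0Provider = 'openai' | 'anthropic' | 'google' | 'groq' | 'cohere';

const SUPPORTED_PROVIDERS: readonly string[] = [
  'openai',
  'anthropic',
  'google',
  'groq',
  'cohere',
];

// Narrow an arbitrary string to the Mem0Provider union.
function isSupportedProvider(value: string): value is Mem0Provider {
  return SUPPORTED_PROVIDERS.includes(value);
}
```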

Best Practices

  • User Identification: Use a unique user_id for consistent memory retrieval.
  • Memory Cleanup: Regularly clean up unused memory data.
<Note> We also have support for `agent_id`, `app_id`, and `run_id`. Refer to the [Docs](https://docs.mem0.ai/api-reference/memory/add-memories). </Note>
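One way to follow the first practice above is to derive `user_id` deterministically from an internal account identifier, so the same account always resolves to the same memory store. A minimal sketch using Node's built-in `crypto` module (the helper name is our own):

```ts
import { createHash } from 'node:crypto';

// Derive a stable, non-reversible user_id from an internal account ID.
// The same input always yields the same id, so memories stay attached
// to the account across sessions.
function stableUserId(accountId: string): string {
  return createHash('sha256').update(accountId).digest('hex').slice(0, 16);
}

console.log(stableUserId('account-42'));
```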

Help

  • For more details on Vercel AI SDK, visit the Vercel AI SDK documentation.
  • For Mem0 documentation, refer to the Mem0 Platform.
  • If you need further assistance, please feel free to reach out to us.
