The Mem0 Provider is a library developed by Mem0 to integrate with the AI SDK. It adds persistent memory to your AI applications, allowing models to recall user context across conversations.
<Note type="info"> 🎉 Exciting news! The Mem0 AI SDK now supports <strong>Tool Calls</strong>. </Note>

The Mem0 provider is available in the `@mem0/vercel-ai-provider` module. You can install it with:
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @mem0/vercel-ai-provider" dark />
  </Tab>
</Tabs>
First, get your Mem0 API Key from the Mem0 Dashboard.
Then initialize the Mem0 Client in your application:
```ts
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0({
  provider: 'openai',
  mem0ApiKey: 'm0-xxx',
  apiKey: 'provider-api-key',
  config: {
    // Configure the LLM provider here
  },
  // Optional Mem0 global config
  mem0Config: {
    user_id: 'mem0-user-id',
    enable_graph: true,
  },
});
```
You can add memories from a conversation with `addMemories()`:

```ts
import { LanguageModelV1Prompt } from '@ai-sdk/provider';
import { addMemories } from '@mem0/vercel-ai-provider';

const messages: LanguageModelV1Prompt = [
  { role: 'user', content: [{ type: 'text', text: 'I love red cars.' }] },
];

await addMemories(messages, { user_id: 'borat' });
```
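The messages array above is plain data. A small helper can build that shape from raw strings; this helper is hypothetical (not part of `@mem0/vercel-ai-provider`), shown only to illustrate the expected structure:

```ts
// Hypothetical helper: wraps plain strings in the text-part shape
// used by the addMemories example above. Not part of the provider.
type TextPart = { type: 'text'; text: string };
type UserMessage = { role: 'user'; content: TextPart[] };

function toUserMessages(...texts: string[]): UserMessage[] {
  return texts.map(text => ({
    role: 'user' as const,
    content: [{ type: 'text' as const, text }],
  }));
}
```

`toUserMessages('I love red cars.')` produces the same messages array as the example above.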
The provider exposes standalone memory functions:

- `addMemories()`: Adds user memories to enhance contextual responses.
- `retrieveMemories()`: Retrieves memory context for prompts.
- `getMemories()`: Gets memories from your profile in array format.

```ts
import {
  addMemories,
  retrieveMemories,
  getMemories,
} from '@mem0/vercel-ai-provider';

await addMemories(messages, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});

await retrieveMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});

await getMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
```
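Since `getMemories()` returns memories in array format, you can fold them into a prompt yourself. The formatter below is a hedged sketch: the `memory` text field is an assumption about the shape of Mem0 memory objects, not a documented guarantee:

```ts
// Hypothetical formatter: turns an array of memory objects into a
// system-prompt preamble. The `memory` field name is an assumption
// about the objects returned by getMemories().
function formatMemories(memories: Array<{ memory: string }>): string {
  if (memories.length === 0) return '';
  return (
    'Known facts about the user:\n' +
    memories.map(m => `- ${m.memory}`).join('\n')
  );
}
```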
You can use language models from OpenAI, Anthropic, Cohere, and Groq to generate text with the `generateText` function:
```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  prompt: 'Suggest me a good car to buy!',
});
```
You can also pass a structured messages array:

```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Suggest me a good car to buy.' },
        { type: 'text', text: 'Why is it better than the other cars for me?' },
      ],
    },
  ],
});
```
You can stream responses with the `streamText` function:

```ts
import { streamText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { textStream } = streamText({
  model: mem0('gpt-4.1', {
    user_id: 'borat',
  }),
  prompt:
    'Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.',
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
Tool calls are supported as well:

```ts
import { generateText, tool } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';
import { z } from 'zod';

const mem0 = createMem0({
  provider: 'anthropic',
  apiKey: 'anthropic-api-key',
  mem0Config: {
    // Global user ID
    user_id: 'borat',
  },
});

const prompt = 'What is the temperature in the city that I live in?';

const result = await generateText({
  model: mem0('claude-3-5-sonnet-20240620'),
  tools: {
    weather: tool({
      description: 'Get the weather in a location',
      parameters: z.object({
        location: z.string().describe('The location to get the weather for'),
      }),
      execute: async ({ location }) => ({
        location,
        temperature: 72 + Math.floor(Math.random() * 21) - 10,
      }),
    }),
  },
  prompt: prompt,
});

console.log(result);
```
You can also inspect the memories used to generate a response via the `sources` property:

```ts
const { text, sources } = await generateText({
  model: mem0('gpt-4.1'),
  prompt: 'Suggest me a good car to buy!',
});

console.log(sources);
```
This same functionality is available in the `streamText` function.
The Mem0 provider supports the following LLM providers:

| Provider  | Configuration Value |
| --------- | ------------------- |
| OpenAI    | openai              |
| Anthropic | anthropic           |
| Google    | google              |
| Groq      | groq                |
| Cohere    | cohere              |
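For example, to target a different provider, pass its configuration value from the table to `createMem0`. This is a sketch: the API key is a placeholder, and the keys can also come from environment variables as in the earlier `createMem0()` examples:

```ts
import { createMem0 } from '@mem0/vercel-ai-provider';

// Sketch: select the Google provider via its configuration value
// from the table above. The API key value is a placeholder.
const mem0 = createMem0({
  provider: 'google',
  apiKey: 'google-api-key',
});
```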
Use a unique `user_id` for consistent memory retrieval.