The Mem0 AI SDK Provider is a library developed by Mem0 to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.
<Note type="info"> Mem0 AI SDK now supports <strong>Vercel AI SDK V5</strong>. </Note>

Install the SDK provider using npm:
```bash
npm install @mem0/vercel-ai-provider
```
Get your Mem0 API Key from the <a href="https://app.mem0.ai/dashboard/api-keys?utm_source=oss&utm_medium=integration-vercel-ai-sdk" rel="nofollow">Mem0 Dashboard</a>.
Initialize the Mem0 Client in your application:
```typescript
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0({
  provider: "openai",
  mem0ApiKey: "m0-xxx",
  apiKey: "provider-api-key",
  config: {
    // Options for LLM Provider
  },
  // Optional Mem0 Global Config
  mem0Config: {
    user_id: "mem0-user-id",
  },
});
```
Note: The `openai` provider is set as default. Consider using `MEM0_API_KEY` and `OPENAI_API_KEY` as environment variables for security.
Note: The `mem0Config` parameter is optional. It sets the global config for the Mem0 Client (e.g. `user_id`, `agent_id`, `app_id`, `run_id`, etc.).
Add Memories to Enhance Context:
```typescript
import { LanguageModelV2Prompt } from "@ai-sdk/provider";
import { addMemories } from "@mem0/vercel-ai-provider";

const messages: LanguageModelV2Prompt = [
  { role: "user", content: [{ type: "text", text: "I love red cars." }] },
];

await addMemories(messages, { user_id: "borat" });
```
```typescript
await addMemories(messages, { user_id: "borat", mem0ApiKey: "m0-xxx" });
await retrieveMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx" });
await getMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx" });
```
> For standalone features, such as `addMemories`, `retrieveMemories`, and `getMemories`, you must either set `MEM0_API_KEY` as an environment variable or pass it directly in the function call.
> `getMemories` will return raw memories in the form of an array of objects, while `retrieveMemories` will return a response in string format with a system prompt ingested with the retrieved memories.
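As an illustration of the two shapes, here is a hedged sketch that formats a `getMemories`-style array into a system-prompt string like the one `retrieveMemories` produces (the `memory` field name and the prompt wording are assumptions, not the SDK's exact output):

```typescript
interface MemoryItem {
  id: string;
  memory: string; // assumed field name for the memory text
}

// Build a single system-prompt string from raw memory objects,
// similar in spirit to the string retrieveMemories() returns.
function memoriesToSystemPrompt(memories: MemoryItem[]): string {
  if (memories.length === 0) return "";
  const lines = memories.map((m) => `- ${m.memory}`).join("\n");
  return `These are the memories I have stored about the user:\n${lines}`;
}
```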
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { text } = await generateText({
model: mem0("gpt-4-turbo", { user_id: "borat" }),
prompt: "Suggest me a good car to buy!",
});
```
```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@mem0/vercel-ai-provider";
const prompt = "Suggest me a good car to buy.";
const memories = await retrieveMemories(prompt, { user_id: "borat" });
const { text } = await generateText({
model: openai("gpt-4-turbo"),
prompt: prompt,
system: memories,
});
```
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { text } = await generateText({
model: mem0("gpt-4-turbo", { user_id: "borat" }),
messages: [
{
role: "user",
content: [
{ type: "text", text: "Suggest me a good car to buy." },
{ type: "text", text: "Why is it better than the other cars for me?" },
],
},
],
});
```
```typescript
import { streamText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { textStream } = streamText({
model: mem0("gpt-4-turbo", {
user_id: "borat",
}),
prompt: "Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.",
});
for await (const textPart of textStream) {
process.stdout.write(textPart);
}
```
```typescript
import { generateText, tool } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
import { z } from "zod";
const mem0 = createMem0({
provider: "anthropic",
apiKey: "anthropic-api-key",
mem0Config: {
// Global User ID
user_id: "borat"
}
});
const prompt = "What is the temperature in the city that I live in?";
const result = await generateText({
model: mem0('claude-3-5-sonnet-20240620'),
tools: {
weather: tool({
description: 'Get the weather in a location',
parameters: z.object({
location: z.string().describe('The location to get the weather for'),
}),
execute: async ({ location }) => ({
location,
temperature: 72 + Math.floor(Math.random() * 21) - 10,
}),
}),
},
prompt: prompt,
});
console.log(result);
```
```typescript
const { text, sources } = await generateText({
  model: mem0("gpt-4-turbo"),
  prompt: "Suggest me a good car to buy!",
});

console.log(sources);
```
The same works with `streamText` as well.
Mem0 AI SDK supports file processing with memory context. Here's an example of analyzing a PDF file:
```typescript
import { streamText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
import { readFileSync } from 'fs';
import { join } from 'path';

const mem0 = createMem0({
  provider: "google",
  mem0ApiKey: "m0-xxx",
  config: {
    apiKey: "google-api-key"
  },
  mem0Config: {
    user_id: "alice",
  },
});

async function main() {
  // Read the PDF file
  const filePath = join(process.cwd(), 'my_pdf.pdf');
  const fileBuffer = readFileSync(filePath);

  // Convert the file's arrayBuffer to a Base64 data URL
  const arrayBuffer = fileBuffer.buffer.slice(fileBuffer.byteOffset, fileBuffer.byteOffset + fileBuffer.byteLength);
  const uint8Array = new Uint8Array(arrayBuffer);
  const charArray = Array.from(uint8Array, byte => String.fromCharCode(byte));
  const binaryString = charArray.join('');
  const base64Data = Buffer.from(binaryString, 'binary').toString('base64');
  const fileDataUrl = `data:application/pdf;base64,${base64Data}`;

  const { textStream } = streamText({
    model: mem0("gemini-2.5-flash"),
    messages: [
      {
        role: 'user',
        content: [
          {
            type: 'text',
            text: 'Analyze the following PDF and generate a summary.',
          },
          {
            type: 'file',
            data: fileDataUrl,
            mediaType: 'application/pdf',
          },
        ],
      },
    ],
  });

  for await (const textPart of textStream) {
    process.stdout.write(textPart);
  }
}

main();
```
Note: File support is available with providers that support multimodal capabilities like Google's Gemini models. The example shows how to process PDF files, but you can also work with images, text files, and other supported formats.
| Provider | Configuration Value |
|---|---|
| OpenAI | openai |
| Anthropic | anthropic |
| Groq | groq |
Note: For Google Gemini models, you can use the `@ai-sdk/google` package.
- `createMem0()`: Initializes a new Mem0 provider instance.
- `retrieveMemories()`: Retrieves memory context for prompts.
- `getMemories()`: Gets memories from your profile in array format.
- `addMemories()`: Adds user memories to enhance contextual responses.

Best practices:

- User Identification: Use a unique `user_id` for consistent memory retrieval.
- Memory Cleanup: Regularly clean up unused memory data.
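One way to keep `user_id` consistent across sessions (a sketch, not a Mem0 requirement; `toUserId` is a hypothetical helper) is to derive it deterministically from an identifier you already have, such as an email address:

```typescript
import { createHash } from "crypto";

// Derive a deterministic user_id from a stable identifier, so the
// same user always maps to the same memory space regardless of
// casing or surrounding whitespace in the input.
function toUserId(identifier: string): string {
  return createHash("sha256")
    .update(identifier.trim().toLowerCase())
    .digest("hex")
    .slice(0, 32);
}
```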
Note: We also have support for `agent_id`, `app_id`, and `run_id`. Refer to the docs for details.
Mem0's Vercel AI SDK enables the creation of intelligent, context-aware applications with persistent memory and seamless integration.
<CardGroup cols={2}> <Card title="OpenAI Agents SDK" icon="cube" href="/integrations/openai-agents-sdk"> Build agents with OpenAI SDK and Mem0 </Card> <Card title="Mastra Integration" icon="star" href="/integrations/mastra"> Create intelligent agents with Mastra framework </Card> </CardGroup>