apps/docs/supermemory-mcp/mcp.mdx
Supermemory MCP Server 4.0 gives AI assistants (Claude, Cursor, Windsurf, etc.) persistent memory across conversations. Built on Cloudflare Workers with Durable Objects for scalable, persistent connections.
```bash
npx -y install-mcp@latest https://mcp.supermemory.ai/mcp --client claude --oauth=yes
```

Replace `claude` with your MCP client: `cursor`, `windsurf`, `vscode`, etc.
Add to your MCP client config:

```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp"
    }
  }
}
```
The server uses OAuth by default. Your client will discover the authorization server via `/.well-known/oauth-protected-resource` and prompt you to authenticate.
If you prefer API keys over OAuth, get one from app.supermemory.ai and pass it in the `Authorization` header:

```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "Authorization": "Bearer sm_your_api_key_here"
      }
    }
  }
}
```

API keys start with `sm_` and skip OAuth when provided.
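If you are wiring the server into a custom client rather than an off-the-shelf one, the same header goes on every HTTP request. A minimal sketch, assuming the standard MCP JSON-RPC framing over HTTP (the key value is a placeholder, and the protocol version shown is from the MCP spec, not this document):

```typescript
// Sketch: calling the MCP endpoint directly with an API key.
// MCP over HTTP speaks JSON-RPC 2.0; this builds the initialize
// request a client sends first. The key below is a placeholder.
const API_KEY = "sm_your_api_key_here";

const initRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-03-26",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.1.0" },
  },
};

const headers = {
  "Content-Type": "application/json",
  Accept: "application/json, text/event-stream",
  Authorization: `Bearer ${API_KEY}`, // sm_ keys skip OAuth
};

// To actually send it:
// await fetch("https://mcp.supermemory.ai/mcp", {
//   method: "POST", headers, body: JSON.stringify(initRequest),
// });
```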
Scope all operations to a specific project with the `x-sm-project` header:

```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "x-sm-project": "your-project-id"
      }
    }
  }
}
```
### memory

Save or forget information about the user.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `content` | string | Yes | The memory content to save or forget |
| `action` | `"save" \| "forget"` | No | Action to take. Default: `"save"` |
| `containerTag` | string | No | Project tag to scope the memory |
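Over the wire this is a standard MCP `tools/call` request. A sketch of the request body, assuming the JSON-RPC framing from the MCP spec (the `containerTag` value is a placeholder):

```typescript
// Sketch: a tools/call request body for the memory tool.
// "action" defaults to "save"; set it to "forget" to remove a memory.
const saveRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "memory",
    arguments: {
      content: "The user prefers TypeScript over JavaScript",
      action: "save",             // or "forget"
      containerTag: "my-project", // optional; placeholder value
    },
  },
};
```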
### recall

Search memories and get the user profile.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `query` | string | Yes | Search query to find relevant memories |
| `includeProfile` | boolean | No | Include user profile summary. Default: `true` |
| `containerTag` | string | No | Project tag to scope the search |
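The recall tool takes the same `tools/call` shape. A sketch with the profile summary turned off, for a plain memory search (JSON-RPC framing assumed from the MCP spec):

```typescript
// Sketch: a tools/call request body for the recall tool.
// includeProfile defaults to true; disable it to search memories only.
const recallRequest = {
  jsonrpc: "2.0",
  id: 3,
  method: "tools/call",
  params: {
    name: "recall",
    arguments: {
      query: "editor preferences",
      includeProfile: false,
    },
  },
};
```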
### whoAmI

Get the current logged-in user's information. Returns `{ userId, email, name, client, sessionId }`.
| URI | Description |
|---|---|
| `supermemory://profile` | User profile with stable preferences and recent activity |
| `supermemory://projects` | List of available memory projects |
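Resources are fetched with the standard MCP `resources/read` method. A sketch of the request body for the profile resource (JSON-RPC framing assumed from the MCP spec):

```typescript
// Sketch: a resources/read request body for the profile resource.
const readProfile = {
  jsonrpc: "2.0",
  id: 4,
  method: "resources/read",
  params: { uri: "supermemory://profile" },
};
```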
### context

Inject the user's profile and preferences as system context for AI conversations. Returns a formatted message with the user's stable preferences and recent activity.

In Cursor and Claude Code you can invoke it with `/context`, which gives the model just enough context to use and query Supermemory effectively.

Purpose: unlike the `recall` tool (which searches for specific information) or the `profile` resource (which returns raw data), the `context` prompt provides a pre-formatted system message designed for injection at the start of a conversation.
| Parameter | Type | Required | Description |
|---|---|---|---|
| `containerTag` | string | No | Project tag to scope the profile (max 128 chars) |
| `includeRecent` | boolean | No | Include recent activity in the profile. Default: `true` |
Output format:

- The user's stable preferences (saved via the `memory` tool)
- Recent activity (when `includeRecent` is true)

When to use:

- The `context` prompt for automatic system context injection at conversation start
- The `recall` tool when you need to search for specific information
- The `profile` resource when you need raw profile data for custom processing
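Prompts are retrieved with the MCP `prompts/get` method. A sketch requesting the `context` prompt, assuming the framing from the MCP spec (which passes prompt arguments as string values):

```typescript
// Sketch: a prompts/get request body for the context prompt.
// Per the MCP spec, prompt argument values are strings.
const getContext = {
  jsonrpc: "2.0",
  id: 5,
  method: "prompts/get",
  params: {
    name: "context",
    arguments: { includeRecent: "true" },
  },
};
```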