apps/docs/install.md
You are integrating Supermemory into my application. Supermemory provides user memory, semantic search, and automatic knowledge extraction for AI applications.
You can always reference the documentation by using the SearchSupermemoryDocs MCP or running a web search tool for content on supermemory.ai/docs.
Ask the user:

- What are you building?
- How do you want to integrate?
- Data model?
- Do you want USER PROFILES? User profiles are automatically maintained facts about users (what they like, what they're working on, preferences).
- How should I retrieve context?
  - OPTION A (one call): `profile({ containerTag, q: userMessage })`
  - OPTION B (separate calls): `profile()` for facts, `search()` for memories

Then install the SDK and set the API key:

```bash
# Get API key: https://console.supermemory.ai
npm install supermemory  # or: pip install supermemory
# For Vercel AI SDK: npm install @supermemory/tools
export SUPERMEMORY_API_KEY="sm_..."
```
Configure ingestion filtering:

```typescript
// PATCH https://api.supermemory.ai/v3/settings
await fetch('https://api.supermemory.ai/v3/settings', {
  method: 'PATCH',
  headers: {
    'x-supermemory-api-key': process.env.SUPERMEMORY_API_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    shouldLLMFilter: true,
    filterPrompt: `This is a [your app description]. containerTag is [userId/orgId]. We store [what data].`
  })
})
```
Based on their data model answer:
```typescript
// USER-ONLY APP:
containerTag: userId

// ORG-ONLY APP:
containerTag: orgId // Org members share memories

// BOTH (ask which):
// Option A: Unique per user-org combination
containerTag: `${userId}-${orgId}`
// Option B: Org-scoped with user metadata
containerTag: orgId, metadata: { userId }
// Option C: User-scoped with org metadata
containerTag: userId, metadata: { orgId }
```
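For the BOTH case, a small helper keeps whichever tag scheme you pick consistent across every add/search/profile call. A sketch for Option A; `buildContainerTag` is an illustrative name, not part of the SDK:

```typescript
// Hypothetical helper: one place to encode the containerTag convention.
// The `${userId}-${orgId}` format is an app-level choice, not a Supermemory requirement.
function buildContainerTag(userId: string, orgId?: string): string {
  if (!orgId) return userId       // user-only flows fall back to the plain userId
  return `${userId}-${orgId}`     // Option A: unique per user-org combination
}
```

Pass the result everywhere a `containerTag` is expected, so memories written and read for the same user-org pair always land in the same container.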
Based on their integration choice:
```typescript
// Option 1: Agent tools (recommended for agentic flows)
import { streamText } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { supermemoryTools } from '@supermemory/tools/ai-sdk'

const result = await streamText({
  model: anthropic('claude-3-5-sonnet-20241022'),
  prompt: userMessage,
  tools: supermemoryTools(process.env.SUPERMEMORY_API_KEY, {
    containerTags: [userId]
  })
})
// Agent gets searchMemories, addMemory, fetchMemory tools
```

```typescript
// Option 2: Profile middleware (automatic context injection)
import { generateText } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { withSupermemory } from '@supermemory/tools/ai-sdk'

const modelWithMemory = withSupermemory(anthropic('claude-3-5-sonnet-20241022'), userId)

const result = await generateText({
  model: modelWithMemory,
  messages: [{ role: 'user', content: userMessage }]
})
// Profile is automatically injected into context
```
TypeScript SDK with profiles:

```typescript
import Supermemory from 'supermemory'

const client = new Supermemory() // expects SUPERMEMORY_API_KEY in the environment

// Before each LLM call:
const { profile, searchResults } = await client.profile({
  containerTag: userId,
  q: userMessage // Include this if they chose OPTION A (one call)
  // Omit if they chose OPTION B (separate calls)
})

// Build context
const context = `
Static facts: ${profile.static.join('\n')}
Recent context: ${profile.dynamic.join('\n')}
${searchResults ? `Memories: ${searchResults.results.map(r => r.content).join('\n')}` : ''}
`

// Send to LLM
const messages = [
  { role: 'system', content: `User context:\n${context}` },
  { role: 'user', content: userMessage }
]

// After LLM responds:
await client.memories.add({
  content: `user: ${userMessage}\nassistant: ${response}`,
  containerTag: userId
})
```
TypeScript SDK with search only:

```typescript
import Supermemory from 'supermemory'

const client = new Supermemory()

// Search for relevant memories
const results = await client.search({
  q: userMessage,
  containerTag: userId,
  searchMode: 'hybrid', // Searches memories + document chunks
  limit: 5
})

// Build context
const context = results.results.map(r => r.content).join('\n')

// Send to LLM with context
const messages = [
  { role: 'system', content: `Relevant context:\n${context}` },
  { role: 'user', content: userMessage }
]

// Store the conversation
await client.memories.add({
  content: `user: ${userMessage}\nassistant: ${response}`,
  containerTag: userId
})
```
Python SDK:

```python
from supermemory import Supermemory

client = Supermemory()

# With profiles (if they want it)
profile_data = client.profile(
    container_tag=user_id,
    q=user_message  # Include if OPTION A, omit if OPTION B
)

context = f"""
Static: {chr(10).join(profile_data.profile.static)}
Dynamic: {chr(10).join(profile_data.profile.dynamic)}
"""

# Store conversation
client.add(content=f"user: {user_message}\nassistant: {response}", container_tag=user_id)
```
REST API:

```bash
# Add memory
curl -X POST https://api.supermemory.ai/v3/documents \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "conversation", "containerTag": "userId"}'

# Get profile
curl -X POST https://api.supermemory.ai/v4/profile \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"containerTag": "userId", "q": "search query"}'

# Search
curl -X POST https://api.supermemory.ai/v4/search \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"q": "query", "containerTag": "userId", "searchMode": "hybrid"}'
```
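The three calls above repeat the same headers; a tiny shell wrapper (illustrative, not an official CLI) cuts the boilerplate:

```shell
# Hypothetical wrapper around the Supermemory REST endpoints shown above.
sm_post() {
  local path="$1" body="$2"
  curl -sS -X POST "https://api.supermemory.ai${path}" \
    -H "x-supermemory-api-key: ${SUPERMEMORY_API_KEY}" \
    -H "Content-Type: application/json" \
    -d "${body}"
}
```

Usage: `sm_post /v4/search '{"q": "query", "containerTag": "userId"}'`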
File uploads:

```typescript
// Files are automatically extracted (PDFs, images with OCR, videos with transcription)
const formData = new FormData()
formData.append('file', fileBlob)
formData.append('containerTag', userId)

await fetch('https://api.supermemory.ai/v3/documents/file', {
  method: 'POST',
  headers: { 'x-supermemory-api-key': process.env.SUPERMEMORY_API_KEY },
  body: formData
})

// Processing is async - check status before assuming searchable
// GET /v3/documents/{documentId}
```
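Because file processing is asynchronous, a short poll loop avoids searching before extraction finishes. A sketch only: it assumes the `GET /v3/documents/{id}` response exposes a status field (confirm the exact field name and values against the API reference), and takes the status fetcher as a parameter so it stays transport-agnostic:

```typescript
// Sketch: wait until a document finishes processing before searching it.
// `getStatus` is injected (e.g. a wrapper around GET /v3/documents/{id});
// the 'done' value is an assumption -- check the API reference for real values.
async function waitUntilProcessed(
  documentId: string,
  getStatus: (id: string) => Promise<string>,
  intervalMs = 2000,
  maxAttempts = 30
): Promise<boolean> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if ((await getStatus(documentId)) === 'done') return true
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
  return false // still processing after the polling budget
}
```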
Search modes:

```typescript
// HYBRID (recommended) - searches memories + document chunks
searchMode: 'hybrid'

// MEMORIES ONLY - just extracted memories, no original text
searchMode: 'memories'
```
Metadata filters:

```typescript
await client.search({
  q: query,
  containerTag: userId,
  filters: {
    AND: [
      { key: 'type', value: 'conversation', type: 'string_equal' },
      { key: 'timestamp', value: '2024', type: 'string_contains' }
    ]
  }
})
```
Verify the setup:

```bash
# 1. Configure settings
curl -X PATCH https://api.supermemory.ai/v3/settings \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"shouldLLMFilter": true, "filterPrompt": "..."}'

# 2. Add test memory
curl -X POST https://api.supermemory.ai/v3/documents \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "Test", "containerTag": "test_user"}'

# 3. Get profile
curl -X POST https://api.supermemory.ai/v4/profile \
  -H "x-supermemory-api-key: $SUPERMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"containerTag": "test_user"}'
```