docs/open-source/features/openai_compatibility.mdx
Mem0 mirrors the OpenAI client interface so you can plug memories into existing chat-completion code with minimal changes. Point your OpenAI-compatible client at Mem0, keep the same request shape, and gain persistent memory between calls.
<Info>
**You’ll use this when…**

- Your app already relies on OpenAI chat completions and you want Mem0 to feel familiar.
- You need to reuse existing middleware that expects OpenAI-compatible responses.
- You plan to switch between Mem0 Platform and the self-hosted client without rewriting code.
</Info>

`client.chat.completions.create(...)` works the same as OpenAI's method signature and accepts `messages`, `model`, and optional memory-scoping fields (`user_id`, `agent_id`, `run_id`).

```python
from mem0.proxy.main import Mem0

client = Mem0(api_key="m0-xxx")

messages = [
    {"role": "user", "content": "I love Indian food but I cannot eat pizza since I'm allergic to cheese."}
]

chat_completion = client.chat.completions.create(
    messages=messages,
    model="gpt-5-mini",
    user_id="alice"
)
```
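Conceptually, `user_id` partitions stored memories per user: a call scoped to one user never sees another user's facts. A toy in-memory sketch of that partitioning (illustrative only, not Mem0's actual implementation):

```python
from collections import defaultdict

class ScopedMemoryStore:
    """Toy illustration of per-user memory scoping; not Mem0's implementation."""

    def __init__(self):
        self._store = defaultdict(list)

    def add(self, user_id, fact):
        self._store[user_id].append(fact)

    def recall(self, user_id):
        # Calls scoped to a different user_id see none of these facts.
        return list(self._store[user_id])
```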
For the self-hosted client, pass a configuration instead of an API key:

```python
from mem0.proxy.main import Mem0

config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "host": "localhost",
            "port": 6333
        }
    }
}

client = Mem0(config=config)

chat_completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the capital of France?"}],
    model="gpt-5-mini"
)
```
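Because the configuration is a plain dictionary, you can sanity-check it before constructing the client. A minimal sketch, assuming the Qdrant layout shown above (the `check_vector_store` helper is illustrative, not part of Mem0):

```python
def check_vector_store(config):
    """Illustrative check, not part of Mem0: verify the vector_store
    section carries the fields the Qdrant provider shown above expects."""
    vs = config["vector_store"]
    if vs["provider"] != "qdrant":
        raise ValueError(f"unexpected provider: {vs['provider']}")
    missing = {"host", "port"} - set(vs["config"])
    if missing:
        raise ValueError(f"missing vector_store config keys: {sorted(missing)}")
    return True
```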
Memories persist across calls that share the same `user_id`:

```python
from mem0.proxy.main import Mem0

client = Mem0(api_key="m0-xxx")

# Store preferences
client.chat.completions.create(
    messages=[{"role": "user", "content": "I love Indian food but I'm allergic to cheese."}],
    model="gpt-5-mini",
    user_id="alice"
)

# Later conversation reuses the memory
response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Suggest dinner options in San Francisco."}],
    model="gpt-5-mini",
    user_id="alice"
)

print(response.choices[0].message.content)
```
Responses match OpenAI's chat-completion shape (`choices`, `usage`, etc.). Test both the Platform client (`Mem0(api_key=...)`) and OSS configurations to ensure both respect the same request body, and inspect `response.metadata.memories` (if enabled) to see which facts the model recalled.

| Parameter | Type | Purpose |
|---|---|---|
| `user_id` | `str` | Associates the conversation with a user so memories persist. |
| `agent_id` | `str` | Optional agent or bot identifier for multi-agent scenarios. |
| `run_id` | `str` | Optional session/run identifier for short-lived flows. |
| `metadata` | `dict` | Store extra fields alongside each memory entry. |
| `filters` | `dict` | Restrict retrieval to specific memories while responding. |
| `top_k` | `int` | Cap how many memories Mem0 pulls into the context (default 10). |
Other request fields mirror OpenAI’s chat completion API.
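Put differently, the request body is just OpenAI's chat-completion body with these optional keys layered on top. A sketch of that layering (the `build_request` helper is illustrative, not part of Mem0):

```python
# The optional memory-scoping fields from the table above; everything
# else in the request body is standard OpenAI chat-completion input.
MEM0_SCOPING_FIELDS = {"user_id", "agent_id", "run_id", "metadata", "filters", "top_k"}

def build_request(messages, model, **scope):
    """Illustrative only: assemble an OpenAI-style request body with
    Mem0's optional scoping fields merged in."""
    unknown = set(scope) - MEM0_SCOPING_FIELDS
    if unknown:
        raise ValueError(f"unexpected scoping fields: {sorted(unknown)}")
    return {"messages": messages, "model": model, **scope}
```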