
# Model Context Protocol


## Overview

The Omi MCP server enables AI assistants like Claude to interact with your Omi data using natural language. Query your memories, manage conversations, and automate workflows through AI-powered tools.

<CardGroup cols={3}> <Card title="Read Memories" icon="brain"> Retrieve and search your memories </Card> <Card title="Manage Data" icon="pen-to-square"> Create, edit, and delete memories </Card> <Card title="Access Conversations" icon="comments"> Browse full conversation transcripts </Card> </CardGroup>

## Hosted MCP Server

The easiest way to use Omi as an MCP server is our hosted endpoint, which uses SSE (Server-Sent Events).

<Steps> <Step title="Generate an API Key" icon="key"> Open the Omi app and navigate to **Settings → Developer → MCP** to generate your API key. </Step> <Step title="Configure your Client" icon="sliders"> Use the following connection details in your MCP client (like [Poke](https://poke.com)):
- **Server URL:** `https://api.omi.me/v1/mcp/sse` (or the custom URL shown in the app)
- **API Key:** `omi_mcp_...` (your generated key)

### Example: Poke

</Step> </Steps>
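If your MCP client supports SSE natively, the URL and API key above are all you need. For stdio-only clients, one common pattern is bridging through the `mcp-remote` package. The config below is a sketch of that pattern, not an official Omi configuration: the server name, the use of `mcp-remote`, and the `Authorization: Bearer` header are assumptions — check how your client and the Omi endpoint expect the key to be passed.

```json
{
  "mcpServers": {
    "omi-hosted": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://api.omi.me/v1/mcp/sse",
        "--header",
        "Authorization: Bearer omi_mcp_..."
      ]
    }
  }
}
```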

## Docker MCP Server

If you prefer to run the MCP server locally:

<Steps> <Step title="Generate an API Key" icon="key"> Open the Omi app and navigate to **Settings → Developer → MCP** to generate your API key. </Step> <Step title="Install Docker" icon="docker"> Install Docker to run the MCP server. We recommend [OrbStack](https://orbstack.dev/) for macOS. </Step> <Step title="Configure Claude Desktop" icon="gear"> Add the following to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "omi": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "-e", "OMI_API_KEY=your_api_key_here", "omiai/mcp-server"]
    }
  }
}
```

Replace `your_api_key_here` with the key you generated in Step 1.
</Step> </Steps> <Tip> The API key can also be provided with each tool call. If not provided, the server uses the `OMI_API_KEY` environment variable as a fallback. </Tip>
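The key-resolution fallback described in the tip can be sketched as follows. This mirrors the documented behavior only; the function and error message are illustrative, not the server's actual internals:

```python
import os

def resolve_api_key(explicit_key=None):
    """Prefer a key passed with the tool call; fall back to OMI_API_KEY."""
    key = explicit_key or os.environ.get("OMI_API_KEY")
    if not key:
        raise ValueError("No API key: pass one with the call or set OMI_API_KEY")
    return key

os.environ["OMI_API_KEY"] = "omi_mcp_env_key"
print(resolve_api_key("omi_mcp_explicit"))  # a key passed with the call wins
print(resolve_api_key())                    # otherwise the env var is used
```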

## Available Tools

<AccordionGroup> <Accordion title="get_memories" icon="brain"> Retrieve a list of user memories.
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `limit` | number | No | Maximum number of memories to retrieve (default: 100) |
| `categories` | array | No | Categories to filter by (default: []) |

**Returns:** JSON object containing list of memories
</Accordion> <Accordion title="create_memory" icon="plus"> Create a new memory.
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `content` | string | Yes | Content of the memory |
| `category` | MemoryFilterOptions | Yes | Category of the memory |

**Returns:** Created memory object
</Accordion> <Accordion title="edit_memory" icon="pen"> Edit an existing memory's content.
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `memory_id` | string | Yes | ID of the memory to edit |
| `content` | string | Yes | New content for the memory |

**Returns:** Status of the operation
</Accordion> <Accordion title="delete_memory" icon="trash"> Delete a memory by ID.
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `memory_id` | string | Yes | ID of the memory to delete |

**Returns:** Status of the operation
</Accordion> <Accordion title="get_conversations" icon="comments"> Retrieve a list of user conversations.
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `include_discarded` | boolean | No | Include discarded conversations (default: false) |
| `limit` | number | No | Maximum number of conversations (default: 25) |

**Returns:** List of conversation objects containing transcripts, timestamps, geolocation, and structured summaries
</Accordion> </AccordionGroup>
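On the wire, an MCP client frames each of these tools as a JSON-RPC 2.0 `tools/call` request, per the MCP specification. A minimal sketch of the envelope for `get_memories` — the argument values here are examples, and `"personal"` is a placeholder category, not a confirmed Omi category name:

```python
import json

# JSON-RPC 2.0 envelope for an MCP tool call (method and params shape
# follow the MCP specification; argument values are illustrative).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_memories",
        "arguments": {"limit": 10, "categories": ["personal"]},
    },
}
print(json.dumps(request, indent=2))
```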

## Example Integrations

<CardGroup cols={3}> <Card title="LangChain" icon="link" href="https://github.com/BasedHardware/omi/tree/main/mcp/examples"> Build chains with Omi data </Card> <Card title="OpenAI Agents" icon="robot" href="https://github.com/BasedHardware/omi/tree/main/mcp/examples"> Create AI agents using Omi </Card> <Card title="DSPy" icon="code" href="https://github.com/BasedHardware/omi/tree/main/mcp/examples"> Programmatic LLM pipelines </Card> </CardGroup>

## Debugging

<Tabs> <Tab title="MCP Inspector" icon="magnifying-glass"> Use the MCP inspector to debug the server:
```bash
npx @modelcontextprotocol/inspector uvx mcp-server-omi
```

For local development:

```bash
cd path/to/servers/src/omi
npx @modelcontextprotocol/inspector uv run mcp-server-omi
```
</Tab> <Tab title="Log Files" icon="file-lines"> View server logs to debug issues:
```bash
tail -n 20 -f ~/Library/Logs/Claude/mcp-server-omi.log
```
</Tab> </Tabs>

## Advanced Configuration

### Custom Backend URL

If you are self-hosting the Omi backend, point the server at your API endpoint by setting the `OMI_API_BASE_URL` environment variable:

```bash
export OMI_API_BASE_URL="https://your-backend-url.com"
```
<Note> This is only needed for self-hosted Omi instances. The default URL points to the official Omi API. </Note>
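A client or script can mirror this resolution. The default host below comes from the hosted-server section; whether a self-hosted backend exposes the same `/v1/mcp/sse` path is an assumption:

```python
import os

def omi_sse_url(env=os.environ):
    """Resolve the SSE endpoint, honoring OMI_API_BASE_URL for self-hosting."""
    base = env.get("OMI_API_BASE_URL", "https://api.omi.me")
    return base.rstrip("/") + "/v1/mcp/sse"

print(omi_sse_url({}))  # default: official Omi API
print(omi_sse_url({"OMI_API_BASE_URL": "https://your-backend-url.com"}))
```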

## Comparison with Developer API

| Feature | MCP | Developer API |
|---------|-----|---------------|
| Purpose | AI assistant integration | Direct HTTP API access |
| Access | Read/write with AI context | Read & write user data |
| Use Case | Claude Desktop, AI agents | Custom apps, dashboards |
| Best For | AI-powered workflows | Batch operations, integrations |
<Info> For programmatic access without AI assistants, use the [Developer API](/doc/developer/api) instead. </Info>
<CardGroup cols={2}> <Card title="Developer API" icon="code" href="/doc/developer/api"> Direct HTTP API for programmatic access </Card> <Card title="GitHub Examples" icon="github" href="https://github.com/BasedHardware/omi/tree/main/mcp/examples"> LangChain, OpenAI Agents, and DSPy examples </Card> <Card title="Discord Community" icon="discord" href="http://discord.omi.me"> Get help from the community </Card> <Card title="Contribution Guide" icon="code-branch" href="/doc/developer/Contribution"> Contribute to the MCP server </Card> </CardGroup>