import { Tabs, TabsContent, TabsContentWrapper, TabsList, TabsTrigger, Link, } from "docs-ui"
export const metadata = {
  title: `${pageNumber} Medusa MCP Remote Server`,
}
The Medusa documentation provides a remote Model Context Protocol (MCP) server that lets you look up information from the Medusa documentation directly in your IDE or AI tools, such as Cursor.
Medusa hosts a Streamable HTTP MCP server at `https://docs.medusajs.com/mcp` that you can add to any AI agent supporting MCP server connections.
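Under the hood, MCP clients that support Streamable HTTP exchange JSON-RPC 2.0 messages with the server over HTTP POST. As a rough sketch, this is the shape of the opening `initialize` request a client sends before anything else (the `clientInfo` name and version here are hypothetical placeholders, and the protocol version shown is one published revision of the MCP spec):

```python
import json

# Build the JSON-RPC 2.0 "initialize" request an MCP client sends first.
# The clientInfo name/version are hypothetical placeholder values.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# An MCP client would POST this JSON body to https://docs.medusajs.com/mcp;
# the server replies with its own capabilities and server info.
print(json.dumps(initialize_request, indent=2))
```

Your IDE or AI tool performs this handshake for you, which is why the configuration below only needs the server URL.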
To connect to the Medusa MCP server in Claude Code, run the following command in your terminal:
```bash
claude mcp add --transport http medusa https://docs.medusajs.com/mcp
```
To manually connect to the Medusa MCP server in Cursor, add the following to your `.cursor/mcp.json` file or Cursor settings, as explained in the Cursor documentation:
```json
{
  "mcpServers": {
    "medusa": {
      "url": "https://docs.medusajs.com/mcp"
    }
  }
}
```
To manually connect to the Medusa MCP server in VSCode, add the following to the `.vscode/mcp.json` file in your workspace:
```json
{
  "servers": {
    "medusa": {
      "type": "http",
      "url": "https://docs.medusajs.com/mcp"
    }
  }
}
```
After connecting to the Medusa MCP server in your AI tool or IDE, you can start asking questions or requesting your AI assistant to build Medusa customizations. It will fetch the relevant information from the Medusa documentation and provide you with accurate answers, code snippets, and explanations.
For example, you can ask: