docs/getting-started/upgrading/from-low-level-sdk.mdx
If you've been building MCP servers directly on the mcp package's Server class — writing list_tools() and call_tool() handlers, hand-crafting JSON Schema dicts, and wiring up transport boilerplate — this guide is for you. FastMCP replaces all of that machinery with a declarative, Pythonic API where your functions are the protocol surface.
The core idea: instead of telling the SDK what your tools look like and then separately implementing them, you write ordinary Python functions and let FastMCP derive the protocol layer from your code. Type hints become JSON Schema. Docstrings become descriptions. Return values are serialized automatically. The plumbing you wrote to satisfy the protocol just disappears.
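To make "type hints become JSON Schema" concrete, here is a toy sketch of the idea using only the standard library. This is **not** FastMCP's actual implementation — `derive_schema` and `TYPE_MAP` are hypothetical names invented for illustration — but it shows how a function signature carries enough information to generate a schema:

```python
# Illustrative sketch only: deriving a JSON Schema-like dict from a
# function's signature, the way FastMCP derives tool schemas from hints.
import inspect

# Hypothetical mapping from Python annotations to JSON Schema type names.
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def derive_schema(fn):
    """Build a minimal JSON Schema from a function's type hints."""
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": TYPE_MAP.get(param.annotation, "string")}
        # Parameters without defaults are required properties.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}

def add(a: float, b: float) -> float:
    """Add two numbers"""
    return a + b

print(derive_schema(add))
# {'type': 'object', 'properties': {'a': {'type': 'number'},
#  'b': {'type': 'number'}}, 'required': ['a', 'b']}
```

FastMCP's real schema generation (built on Pydantic) handles far more — nested models, defaults, descriptions — but the principle is the same: the signature is the source of truth.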
<Note>
This guide covers upgrading from **v1** of the `mcp` package. We'll provide a separate guide when v2 ships.
</Note>

<Note>
Already using FastMCP 1.0 via `from mcp.server.fastmcp import FastMCP`? Your upgrade is simpler — see the [FastMCP 1.0 upgrade guide](/getting-started/upgrading/from-mcp-sdk) instead.
</Note>

<Prompt description="Copy this prompt into any LLM along with your server code to get automated upgrade guidance.">
You are upgrading an MCP server from the `mcp` package's low-level Server class (v1) to FastMCP 3.0. The server currently uses `mcp.server.Server` (or `mcp.server.lowlevel.server.Server`) with manual handler registration. Analyze the provided code and rewrite it using FastMCP's high-level API. The full guide is at https://gofastmcp.com/getting-started/upgrading/from-low-level-sdk and the complete FastMCP documentation is at https://gofastmcp.com — fetch these for complete context.

UPGRADE RULES:

IMPORTS: Replace all `mcp.*` imports with FastMCP equivalents.
- `from mcp.server import Server` or `from mcp.server.lowlevel.server import Server` → `from fastmcp import FastMCP`
- `import mcp.types as types` → remove (not needed for most code)
- `from mcp.server.stdio import stdio_server` → remove (handled by `mcp.run()`)
- `from mcp.server.sse import SseServerTransport` → remove (handled by `mcp.run()`)

SERVER: Replace `Server("name")` with `FastMCP("name")`.

TOOLS: Replace the `list_tools` + `call_tool` handler pair with individual `@mcp.tool` decorators.
- Remove the `@server.list_tools()` handler entirely
- Remove the `@server.call_tool()` handler entirely
- Register each tool as its own function with `@mcp.tool`
- Convert JSON Schema to Python type hints (`{"type": "integer"}` → `int`, `{"type": "string"}` → `str`, `{"type": "array", "items": {"type": "string"}}` → `list[str]`)
- Return plain Python values (`str`, `int`, `dict`, etc.) instead of `list[types.TextContent(...)]`
- For `types.ImageContent` or `types.EmbeddedResource`, use `from fastmcp.utilities.types import Image` or return the appropriate type

RESOURCES: Replace the `list_resources` + `list_resource_templates` + `read_resource` handler trio with individual `@mcp.resource` decorators.
- Static resources: `@mcp.resource("uri://...")`
- Templates: `@mcp.resource("uri://{param}/path")` with `{param}` in the URI and a matching function parameter
- Set `mime_type=` in the decorator if needed

PROMPTS: Replace the `list_prompts` + `get_prompt` handler pair with individual `@mcp.prompt` decorators.
- Register each prompt with `@mcp.prompt`
- Return `list[Message]` for multi-message prompts: `from fastmcp.prompts import Message`
- `Message("text")` defaults to `role="user"`; use `Message("text", role="assistant")` for assistant messages

TRANSPORT: Replace all transport boilerplate with `mcp.run()`.
- `async with stdio_server() as (r, w): await server.run(r, w, ...)` → `mcp.run()` (stdio is the default)
- SSE: `mcp.run(transport="sse", host="...", port=...)`
- HTTP: `mcp.run(transport="http", host="...", port=...)`
- Add `if __name__ == "__main__": mcp.run()`

CONTEXT: Replace `server.request_context` with FastMCP's `Context` parameter.
- `from fastmcp import Context` and add a `ctx: Context` parameter to any tool that needs it
- `server.request_context.session.send_log_message(...)` → `await ctx.info("message")` or `await ctx.warning("message")`
- Progress: `await ctx.report_progress(current, total)`

For each change, show the original code, explain what it did, and provide the FastMCP equivalent.
</Prompt>
```bash
pip install --upgrade fastmcp
# or
uv add fastmcp
```
FastMCP includes the mcp package as a transitive dependency, so you don't lose access to anything.
The Server class requires you to choose a transport, connect streams, build initialization options, and run an event loop. FastMCP collapses all of that into a constructor and a run() call.
```python
import asyncio

from mcp.server import Server
from mcp.server.stdio import stdio_server

server = Server("my-server")

# ... register handlers ...

async def main():
    async with stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            server.create_initialization_options(),
        )

asyncio.run(main())
```
```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

# ... register tools, resources, prompts ...

if __name__ == "__main__":
    mcp.run()
```
Need HTTP instead of stdio? With the Server class, you'd wire up Starlette routes and `SseServerTransport` or `StreamableHTTPSessionManager`. With FastMCP:
```python
mcp.run(transport="http", host="0.0.0.0", port=8000)
```
This is where the difference is most dramatic. The Server class requires two handlers — one to describe your tools (with hand-written JSON Schema) and another to dispatch calls by name. FastMCP eliminates both by deriving everything from your function signature.
```python
import mcp.types as types
from mcp.server import Server

server = Server("math")

@server.list_tools()
async def list_tools() -> list[types.Tool]:
    return [
        types.Tool(
            name="add",
            description="Add two numbers",
            inputSchema={
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        ),
        types.Tool(
            name="multiply",
            description="Multiply two numbers",
            inputSchema={
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        ),
    ]

@server.call_tool()
async def call_tool(
    name: str, arguments: dict
) -> list[types.TextContent]:
    if name == "add":
        result = arguments["a"] + arguments["b"]
        return [types.TextContent(type="text", text=str(result))]
    elif name == "multiply":
        result = arguments["a"] * arguments["b"]
        return [types.TextContent(type="text", text=str(result))]
    raise ValueError(f"Unknown tool: {name}")
```
```python
from fastmcp import FastMCP

mcp = FastMCP("math")

@mcp.tool
def add(a: float, b: float) -> float:
    """Add two numbers"""
    return a + b

@mcp.tool
def multiply(a: float, b: float) -> float:
    """Multiply two numbers"""
    return a * b
```
Each `@mcp.tool` function is self-contained: its name becomes the tool name, its docstring becomes the description, its type annotations become the JSON Schema, and its return value is serialized automatically. No routing. No schema dictionaries. No content-type wrappers.

When converting your `inputSchema` to Python type hints:
| JSON Schema | Python Type |
|---|---|
| `{"type": "string"}` | `str` |
| `{"type": "number"}` | `float` |
| `{"type": "integer"}` | `int` |
| `{"type": "boolean"}` | `bool` |
| `{"type": "array", "items": {"type": "string"}}` | `list[str]` |
| `{"type": "object"}` | `dict` |
| Optional property (not in `required`) | `param: str \| None = None` |
With the Server class, tools return `list[types.TextContent | types.ImageContent | ...]`. In FastMCP, return plain Python values — strings, numbers, dicts, lists, dataclasses, Pydantic models — and serialization is handled for you.
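The flavor of that automatic serialization can be sketched with the standard library. This is a toy model, not FastMCP's exact rules (`serialize` is a hypothetical name): strings pass through as text, and structured values are JSON-encoded.

```python
import json
from dataclasses import dataclass, asdict, is_dataclass

# Toy sketch of return-value serialization: strings pass through,
# dataclasses and other values are JSON-encoded. FastMCP's real logic
# is richer (Pydantic models, structured content, etc.).
def serialize(value):
    if isinstance(value, str):
        return value
    if is_dataclass(value):
        return json.dumps(asdict(value))
    return json.dumps(value)

@dataclass
class User:
    id: int
    name: str

print(serialize("hello"))         # hello
print(serialize({"ok": True}))    # {"ok": true}
print(serialize(User(1, "Ada")))  # {"id": 1, "name": "Ada"}
```

The point: your tool functions return ordinary values, and the content-wrapping that `call_tool` handlers did by hand happens behind the decorator.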
For images or other non-text content, FastMCP provides helpers:
```python
from fastmcp import FastMCP
from fastmcp.utilities.types import Image

mcp = FastMCP("media")

@mcp.tool
def create_chart(data: list[float]) -> Image:
    """Generate a chart from data."""
    png_bytes = generate_chart(data)  # your logic
    return Image(data=png_bytes, format="png")
```
The Server class uses three handlers for resources: `list_resources()` to enumerate them, `list_resource_templates()` for URI templates, and `read_resource()` to serve content — all with manual routing by URI. FastMCP replaces all three with per-resource decorators.
```python
import json

import mcp.types as types
from mcp.server import Server
from pydantic import AnyUrl

server = Server("data")

@server.list_resources()
async def list_resources() -> list[types.Resource]:
    return [
        types.Resource(
            uri=AnyUrl("config://app"),
            name="app_config",
            description="Application configuration",
            mimeType="application/json",
        ),
        types.Resource(
            uri=AnyUrl("config://features"),
            name="feature_flags",
            description="Active feature flags",
            mimeType="application/json",
        ),
    ]

@server.list_resource_templates()
async def list_resource_templates() -> list[types.ResourceTemplate]:
    return [
        types.ResourceTemplate(
            uriTemplate="users://{user_id}/profile",
            name="user_profile",
            description="User profile by ID",
        ),
        types.ResourceTemplate(
            uriTemplate="projects://{project_id}/status",
            name="project_status",
            description="Project status by ID",
        ),
    ]

@server.read_resource()
async def read_resource(uri: AnyUrl) -> str:
    uri_str = str(uri)
    if uri_str == "config://app":
        return json.dumps({"debug": False, "version": "1.0"})
    if uri_str == "config://features":
        return json.dumps({"dark_mode": True, "beta": False})
    if uri_str.startswith("users://"):
        user_id = uri_str.split("/")[2]
        return json.dumps({"id": user_id, "name": f"User {user_id}"})
    if uri_str.startswith("projects://"):
        project_id = uri_str.split("/")[2]
        return json.dumps({"id": project_id, "status": "active"})
    raise ValueError(f"Unknown resource: {uri}")
```
```python
import json

from fastmcp import FastMCP

mcp = FastMCP("data")

@mcp.resource("config://app", mime_type="application/json")
def app_config() -> str:
    """Application configuration"""
    return json.dumps({"debug": False, "version": "1.0"})

@mcp.resource("config://features", mime_type="application/json")
def feature_flags() -> str:
    """Active feature flags"""
    return json.dumps({"dark_mode": True, "beta": False})

@mcp.resource("users://{user_id}/profile")
def user_profile(user_id: str) -> str:
    """User profile by ID"""
    return json.dumps({"id": user_id, "name": f"User {user_id}"})

@mcp.resource("projects://{project_id}/status")
def project_status(project_id: str) -> str:
    """Project status by ID"""
    return json.dumps({"id": project_id, "status": "active"})
```
Static resources and URI templates use the same `@mcp.resource` decorator — FastMCP detects `{placeholders}` in the URI and automatically registers a template. The function parameter `user_id` maps directly to the `{user_id}` placeholder.
Same pattern: the Server class uses `list_prompts()` and `get_prompt()` with manual routing. FastMCP uses one decorator per prompt.
```python
import mcp.types as types
from mcp.server import Server

server = Server("prompts")

@server.list_prompts()
async def list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="review_code",
            description="Review code for issues",
            arguments=[
                types.PromptArgument(
                    name="code",
                    description="The code to review",
                    required=True,
                ),
                types.PromptArgument(
                    name="language",
                    description="Programming language",
                    required=False,
                ),
            ],
        )
    ]

@server.get_prompt()
async def get_prompt(
    name: str, arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name == "review_code":
        code = (arguments or {}).get("code", "")
        language = (arguments or {}).get("language", "")
        lang_note = f" (written in {language})" if language else ""
        return types.GetPromptResult(
            description="Code review prompt",
            messages=[
                types.PromptMessage(
                    role="user",
                    content=types.TextContent(
                        type="text",
                        text=f"Please review this code{lang_note}:\n\n{code}",
                    ),
                )
            ],
        )
    raise ValueError(f"Unknown prompt: {name}")
```
```python
from fastmcp import FastMCP

mcp = FastMCP("prompts")

@mcp.prompt
def review_code(code: str, language: str | None = None) -> str:
    """Review code for issues"""
    lang_note = f" (written in {language})" if language else ""
    return f"Please review this code{lang_note}:\n\n{code}"
```
Returning a `str` from a prompt function automatically wraps it as a user message. For multi-turn prompts, return a `list[Message]`:
```python
from fastmcp import FastMCP
from fastmcp.prompts import Message

mcp = FastMCP("prompts")

@mcp.prompt
def debug_session(error: str) -> list[Message]:
    """Start a debugging conversation"""
    return [
        Message(f"I'm seeing this error:\n\n{error}"),
        Message("I'll help you debug that. Can you share the relevant code?", role="assistant"),
    ]
```
The Server class exposes request context through `server.request_context`, which gives you the raw `ServerSession` for sending notifications. FastMCP replaces this with a typed `Context` object injected into any function that declares it.
```python
import mcp.types as types
from mcp.server import Server

server = Server("worker")

@server.call_tool()
async def call_tool(name: str, arguments: dict):
    if name == "process_data":
        ctx = server.request_context
        await ctx.session.send_log_message(
            level="info", data="Starting processing..."
        )
        # ... do work ...
        await ctx.session.send_log_message(
            level="info", data="Done!"
        )
        return [types.TextContent(type="text", text="Processed")]
```
```python
from fastmcp import FastMCP, Context

mcp = FastMCP("worker")

@mcp.tool
async def process_data(ctx: Context) -> str:
    """Process data with progress logging"""
    await ctx.info("Starting processing...")
    # ... do work ...
    await ctx.info("Done!")
    return "Processed"
```
The `Context` object provides logging (`ctx.debug()`, `ctx.info()`, `ctx.warning()`, `ctx.error()`), progress reporting (`ctx.report_progress()`), resource subscriptions, session state, and more. See Context for the full API.
A full server upgrade, showing how all the pieces fit together:
<CodeGroup>

```python Before (low-level SDK)
import asyncio
import json

import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server
from pydantic import AnyUrl

server = Server("demo")

@server.list_tools()
async def list_tools() -> list[types.Tool]:
    return [
        types.Tool(
            name="greet",
            description="Greet someone by name",
            inputSchema={
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                },
                "required": ["name"],
            },
        )
    ]

@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    if name == "greet":
        return [types.TextContent(type="text", text=f"Hello, {arguments['name']}!")]
    raise ValueError(f"Unknown tool: {name}")

@server.list_resources()
async def list_resources() -> list[types.Resource]:
    return [
        types.Resource(
            uri=AnyUrl("info://version"),
            name="version",
            description="Server version",
        )
    ]

@server.read_resource()
async def read_resource(uri: AnyUrl) -> str:
    if str(uri) == "info://version":
        return json.dumps({"version": "1.0.0"})
    raise ValueError(f"Unknown resource: {uri}")

@server.list_prompts()
async def list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="summarize",
            description="Summarize text",
            arguments=[
                types.PromptArgument(name="text", required=True)
            ],
        )
    ]

@server.get_prompt()
async def get_prompt(
    name: str, arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name == "summarize":
        return types.GetPromptResult(
            description="Summarize text",
            messages=[
                types.PromptMessage(
                    role="user",
                    content=types.TextContent(
                        type="text",
                        text=f"Summarize:\n\n{(arguments or {}).get('text', '')}",
                    ),
                )
            ],
        )
    raise ValueError(f"Unknown prompt: {name}")

async def main():
    async with stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream, write_stream,
            server.create_initialization_options(),
        )

asyncio.run(main())
```

```python After (FastMCP)
import json

from fastmcp import FastMCP

mcp = FastMCP("demo")

@mcp.tool
def greet(name: str) -> str:
    """Greet someone by name"""
    return f"Hello, {name}!"

@mcp.resource("info://version")
def version() -> str:
    """Server version"""
    return json.dumps({"version": "1.0.0"})

@mcp.prompt
def summarize(text: str) -> str:
    """Summarize text"""
    return f"Summarize:\n\n{text}"

if __name__ == "__main__":
    mcp.run()
```

</CodeGroup>
Once you've upgraded, you have access to everything FastMCP provides beyond the basics. Explore the full documentation at [gofastmcp.com](https://gofastmcp.com).