docs/integrations/pydantic-ai.mdx
Pydantic AI ships a FastMCPToolset that lets a Pydantic AI agent call tools exposed by any MCP server through the FastMCP Client. Because the toolset is built on the FastMCP Client, it works with FastMCP servers as well as any other MCP server, and supports the full range of transports: in-memory, STDIO, Streamable HTTP, and SSE.
This page shows how to point FastMCPToolset at a FastMCP server, with examples for each transport. For the toolset's full API, see the Pydantic AI documentation.
FastMCPToolset lives in pydantic-ai-slim behind the fastmcp optional group:
```bash
pip install "pydantic-ai-slim[fastmcp]"
```
Create a FastMCP server with the tools you want to expose. We'll use a single dice-rolling tool throughout this guide.
```python
import random

from fastmcp import FastMCP

mcp = FastMCP(name="Dice Roller")

@mcp.tool
def roll_dice(n_dice: int) -> list[int]:
    """Roll `n_dice` 6-sided dice and return the results."""
    return [random.randint(1, 6) for _ in range(n_dice)]

if __name__ == "__main__":
    mcp.run(transport="http", port=8000)
```
If your FastMCP server lives in the same process as your agent, pass the FastMCP instance directly. The toolset uses FastMCP's in-memory transport, which avoids network round trips entirely and is the fastest option for tests and embedded use.
```python
import asyncio
import random

from fastmcp import FastMCP
from pydantic_ai import Agent
from pydantic_ai.toolsets.fastmcp import FastMCPToolset

mcp = FastMCP(name="Dice Roller")

@mcp.tool
def roll_dice(n_dice: int) -> list[int]:
    return [random.randint(1, 6) for _ in range(n_dice)]

toolset = FastMCPToolset(mcp)
agent = Agent("openai:gpt-4.1", toolsets=[toolset])

async def main():
    result = await agent.run("Roll 3 dice!")
    print(result.output)

if __name__ == "__main__":
    asyncio.run(main())
```
For a remote FastMCP server reachable over HTTP, pass the URL as a string. The toolset infers the Streamable HTTP transport from the URL.
```python
from pydantic_ai import Agent
from pydantic_ai.toolsets.fastmcp import FastMCPToolset

toolset = FastMCPToolset("https://your-server-url.com/mcp")
agent = Agent("openai:gpt-4.1", toolsets=[toolset])
```
For SSE, use a /sse URL instead.
To launch a FastMCP server as a subprocess, pass a script path and the toolset will use the STDIO transport.
```python
from pydantic_ai import Agent
from pydantic_ai.toolsets.fastmcp import FastMCPToolset

toolset = FastMCPToolset("server.py")
agent = Agent("openai:gpt-4.1", toolsets=[toolset])
```
You can also pass a StdioTransport directly when you need control over the command, args, or environment.
To wire up multiple servers at once, pass an MCP configuration dictionary. The toolset opens one client per server and exposes all of their tools to the agent.
```python
from pydantic_ai import Agent
from pydantic_ai.toolsets.fastmcp import FastMCPToolset

mcp_config = {
    "mcpServers": {
        "dice": {"command": "python", "args": ["server.py"]},
        "weather": {"url": "https://weather.example.com/mcp"},
    }
}

toolset = FastMCPToolset(mcp_config)
agent = Agent("openai:gpt-4.1", toolsets=[toolset])
```
Because FastMCPToolset wraps a FastMCP Client, it inherits the client's full authentication story. To pass credentials such as a bearer token to a remote server, build a Client (or StreamableHttpTransport) yourself and hand it to the toolset.
```python
from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport
from pydantic_ai import Agent
from pydantic_ai.toolsets.fastmcp import FastMCPToolset

transport = StreamableHttpTransport(
    url="https://your-server-url.com/mcp",
    headers={"Authorization": "Bearer your-access-token"},
)
toolset = FastMCPToolset(Client(transport))
agent = Agent("openai:gpt-4.1", toolsets=[toolset])
```
For OAuth flows, use FastMCP's OAuth helper when constructing the Client. For server-side token verification, see Token Verification.