# Sampling Examples
These examples demonstrate FastMCP's sampling API, which allows server tools to request LLM completions from the client.
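For orientation, a sampling-enabled tool looks roughly like the following. This is a minimal sketch rather than one of the examples below: the server name, tool, and prompt are illustrative, and it assumes the completion comes back as text content exposing a `.text` attribute.

```python
from fastmcp import FastMCP, Context

mcp = FastMCP("sampling-demo")

@mcp.tool
async def summarize(text: str, ctx: Context) -> str:
    # Sends a sampling request to the connected client's LLM
    # and waits for the completion.
    result = await ctx.sample(f"Summarize in one sentence:\n\n{text}")
    return result.text  # assumes a text completion came back

if __name__ == "__main__":
    mcp.run()
```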
## Setup

```bash
pip install fastmcp[anthropic]
export ANTHROPIC_API_KEY=your-key
```
Or run directly with uv:

```bash
uv run examples/sampling/text.py
```
## Examples

### [text.py](text.py)

Basic sampling flow where a server tool requests an LLM completion:

```bash
uv run examples/sampling/text.py
```
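On the client side, a sampling handler answers those requests. Here is a sketch of the wiring, assuming an Anthropic handler exists alongside the OpenAI handler shown at the end of this README; the import path and model name are assumptions, not taken from the examples.

```python
import asyncio
from fastmcp import Client
# Assumed import path, mirroring the OpenAI handler shown below.
from fastmcp.client.sampling.handlers.anthropic import AnthropicSamplingHandler

async def main():
    handler = AnthropicSamplingHandler(default_model="claude-sonnet-4-0")  # model name is illustrative
    # The client answers any sampling requests the server sends back.
    async with Client("examples/sampling/text.py", sampling_handler=handler) as client:
        tools = await client.list_tools()
        print("server exposes:", [t.name for t in tools])

asyncio.run(main())
```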
### [structured_output.py](structured_output.py)

Uses `result_type` to get validated Pydantic models from the LLM:

```bash
uv run examples/sampling/structured_output.py
```
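Inside a tool, that looks roughly like this sketch. The `Sentiment` model is illustrative, and it assumes the call returns the validated model instance directly; only the `result_type` parameter comes from the example's description.

```python
from pydantic import BaseModel
from fastmcp import FastMCP, Context

mcp = FastMCP("structured-demo")

class Sentiment(BaseModel):
    label: str    # e.g. "positive" or "negative"
    score: float  # confidence between 0 and 1

@mcp.tool
async def classify(text: str, ctx: Context) -> Sentiment:
    # result_type asks the sampling layer to validate the LLM's
    # answer against the Pydantic model before returning it.
    return await ctx.sample(
        f"Classify the sentiment of this text: {text}",
        result_type=Sentiment,
    )
```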
### [tool_use.py](tool_use.py)

Gives the LLM tools to use during sampling (calculator, time, dice):

```bash
uv run examples/sampling/tool_use.py
```
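A sketch of the idea, with loud caveats: the `tools=` parameter name and the acceptance of plain Python callables are assumptions, so check tool_use.py for the actual signature.

```python
import random
from fastmcp import FastMCP, Context

mcp = FastMCP("tool-use-demo")

def roll_dice(sides: int = 6) -> int:
    """Roll one die with the given number of sides."""
    return random.randint(1, sides)

@mcp.tool
async def game_master(prompt: str, ctx: Context) -> str:
    # The LLM may call roll_dice while producing its completion.
    result = await ctx.sample(prompt, tools=[roll_dice])  # tools= is an assumed name
    return result.text
```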
### [server_fallback.py](server_fallback.py)

Configures a fallback sampling handler on the server, enabling sampling even when clients don't support it:

```bash
uv run examples/sampling/server_fallback.py
```
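The configuration is roughly shaped like this sketch, reusing the OpenAI handler import shown below; the `sampling_handler` constructor argument is an assumption based on the example's description, so see server_fallback.py for the exact API.

```python
from fastmcp import FastMCP
from fastmcp.client.sampling.handlers.openai import OpenAISamplingHandler

# Used only when the connected client can't service sampling itself.
mcp = FastMCP(
    "fallback-demo",
    sampling_handler=OpenAISamplingHandler(default_model="gpt-4o-mini"),  # assumed kwarg
)
```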
## Using OpenAI

To use OpenAI instead of Anthropic, change the handler:

```python
from fastmcp.client.sampling.handlers.openai import OpenAISamplingHandler

handler = OpenAISamplingHandler(default_model="gpt-4o-mini")
```

Install with `pip install fastmcp[openai]` and set `OPENAI_API_KEY` in your environment.
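Wiring the swapped handler into a client could then look like this sketch; the server path is illustrative.

```python
import asyncio
from fastmcp import Client
from fastmcp.client.sampling.handlers.openai import OpenAISamplingHandler

async def main():
    handler = OpenAISamplingHandler(default_model="gpt-4o-mini")
    # Same client wiring as before, with the OpenAI handler swapped in.
    async with Client("examples/sampling/text.py", sampling_handler=handler) as client:
        await client.ping()  # confirm the connection works

asyncio.run(main())
```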