# Example: Using MCP (Model Context Protocol) Servers with TensorZero

This example shows how to use an MCP (Model Context Protocol) server with TensorZero.
We'll use `mcp-clickhouse` to build a chatbot that can answer questions about the contents of your ClickHouse database.

Here's a sample conversation:
```
[User]
Inspect the schemas and tell me how many inferences I have.

[Tool Call: list_tables]
{"database":"tensorzero"}

[Tool Result]
... redacted for brevity ...

[Tool Call: run_select_query]
{"query":"SELECT count(DISTINCT inference_id) AS total_inferences FROM tensorzero.ModelInference"}

[Tool Result]
{"total_inferences": 90}

[Assistant]
You have a total of 90 inferences recorded in the tensorzero.ModelInference table. Let me know if you need inference counts from other related tables or more details.
```
> [!WARNING]
>
> This example is for educational purposes only. The agent is likely to hallucinate and make mistakes without additional context and optimization.
We provide a simple configuration in `config/tensorzero.toml`. It defines a straightforward chat function, `clickhouse_copilot`, with a single variant that uses GPT-4.1 Mini.
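As a rough sketch, a TensorZero configuration for such a function could look like the following (the variant name and model identifier below are illustrative assumptions, not necessarily the exact contents of the file):

```toml
# Illustrative sketch of config/tensorzero.toml (not the exact file contents)

# A chat function for the ClickHouse copilot
[functions.clickhouse_copilot]
type = "chat"

# A single variant backed by GPT-4.1 Mini
[functions.clickhouse_copilot.variants.gpt_4_1_mini]
type = "chat_completion"
model = "openai::gpt-4.1-mini"
```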
We also provide a sample configuration for the MCP server in `config/mcp-clickhouse.toml`.
## Getting Started

1. Set the `OPENAI_API_KEY` environment variable to your OpenAI API key.
2. Run `docker compose up` to start TensorZero.
3. Install the dependencies with `uv`: `uv sync`
4. Run the example: `python main.py`