docs/examples/tools/mcp_toolbox.ipynb
Integrate your databases with LlamaIndex agents using MCP Toolbox.
MCP Toolbox for Databases is an open source MCP server for databases, designed with enterprise-grade, production-quality workloads in mind. It lets you develop tools more easily, quickly, and securely by handling complexities such as connection pooling and authentication for you.
Toolbox tools can be seamlessly integrated with LlamaIndex applications. Toolbox is an open source server that you deploy and manage yourself; for instructions on getting started, deploying, and configuring it, see the official Toolbox documentation.
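Toolbox reads its tool definitions from a `tools.yaml` file that declares database sources and the SQL-backed tools exposed to agents. As an illustrative sketch only (the source credentials, tool name, parameters, and SQL statement below are all hypothetical, not this tutorial's actual configuration), a hotel-search tool backed by PostgreSQL might be declared like this:

```yaml
sources:
  # Hypothetical PostgreSQL connection; replace with your own database.
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
tools:
  # A tool the agent can call; the name and statement are examples only.
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
```

You can then start the server with `./toolbox --tools-file tools.yaml`, which by default listens on `127.0.0.1:5000` — the address the client code below connects to.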
Install the LlamaIndex compatible MCP Toolbox SDK package before getting started:
pip install toolbox-llamaindex
Once your Toolbox server is configured and up and running, you can load tools from your server:
import asyncio
import os
from llama_index.core.agent.workflow import AgentWorkflow
from llama_index.core.workflow import Context
from llama_index.llms.google_genai import GoogleGenAI
from toolbox_llamaindex import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention it's name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in it's name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
    llm = GoogleGenAI(
        api_key=os.getenv("GOOGLE_API_KEY"),
        model="gemini-2.0-flash-001",
    )
    # To authenticate through Vertex AI instead of an API key:
    # llm = GoogleGenAI(
    #     model="gemini-2.0-flash-001",
    #     vertexai_config={"project": "project-id", "location": "us-central1"},
    # )

    # Load the tools from the Toolbox server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = await client.aload_toolset()

        agent = AgentWorkflow.from_tools_or_functions(
            tools,
            llm=llm,
            system_prompt=prompt,
        )

        # Inspect the tools loaded from the server
        for tool in tools:
            print(tool.metadata)

        # A shared Context preserves conversation state across queries
        ctx = Context(agent)
        for query in queries:
            response = await agent.run(user_msg=query, ctx=ctx)
            print()
            print(f"---- {query} ----")
            print(str(response))
await run_application()
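Calling `aload_toolset()` with no arguments loads every tool the server exposes. You can also load a named toolset, or a single tool by name. The sketch below assumes a running Toolbox server and a tool named `search-hotels-by-name` defined in your `tools.yaml` (both hypothetical here), so it is a fragment to adapt rather than code to run as-is:

```python
from toolbox_llamaindex import ToolboxClient


async def load_selected_tools():
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # Load only the tools grouped under a named toolset
        # ("hotel-toolset" is a hypothetical name from tools.yaml).
        hotel_tools = await client.aload_toolset("hotel-toolset")

        # Or load one tool by its name.
        search_tool = await client.aload_tool("search-hotels-by-name")
        return hotel_tools, search_tool
```

Loading a narrower toolset keeps the agent's tool list small, which generally improves tool-selection accuracy.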
Toolbox has a variety of features that make developing Gen AI tools for databases easier. To learn more, read about the following features: