Using MCP Toolbox with LlamaIndex

docs/examples/tools/mcp_toolbox.ipynb


Integrate your databases with LlamaIndex agents using MCP Toolbox.

Overview

MCP Toolbox for Databases is an open source MCP server for databases, designed with enterprise-grade performance and production quality in mind. It lets you develop tools more easily, quickly, and securely by handling complexities such as connection pooling and authentication for you.

Toolbox Tools can be seamlessly integrated with LlamaIndex applications. For more information on getting started or configuring MCP Toolbox, see the documentation.

Configure and deploy

Toolbox is an open source server that you deploy and manage yourself. For instructions on deploying and configuring it, see the official Toolbox documentation.
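As an illustration, tools are defined in a `tools.yaml` file passed to the server at startup. The sketch below follows the hotel example used later on this page; the source name, credentials, and SQL are placeholders, so check the Toolbox documentation for the exact schema supported by your database:

```yaml
# Hypothetical tools.yaml sketch -- names and credentials are placeholders.
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password

tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
```

You would then start the server with this file (for example, `./toolbox --tools-file "tools.yaml"`), making the tools available over HTTP for clients to load.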

Install client SDK

Install the LlamaIndex-compatible MCP Toolbox SDK package before getting started:

shell
pip install toolbox-llamaindex

Loading Toolbox Tools

Once your Toolbox server is configured and running, you can load tools from it:

python
import asyncio
import os
from llama_index.core.agent.workflow import AgentWorkflow
from llama_index.core.workflow import Context
from llama_index.llms.google_genai import GoogleGenAI
from toolbox_llamaindex import ToolboxClient

prompt = """
  You're a helpful hotel assistant. You handle hotel searching, booking and
  cancellations. When the user searches for a hotel, mention its name, id,
  location and price tier. Always mention hotel ids while performing any
  searches. This is very important for any operations. For any bookings or
  cancellations, please provide the appropriate confirmation. Be sure to
  update check-in or check-out dates if mentioned by the user.
  Don't ask for confirmations from the user.
"""

queries = [
    "Find hotels in Basel with Basel in its name.",
    "Can you book the Hilton Basel for me?",
    "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
    "My check in dates would be from April 10, 2024 to April 19, 2024.",
]


async def run_application():
    llm = GoogleGenAI(
        api_key=os.getenv("GOOGLE_API_KEY"),
        model="gemini-2.0-flash-001",
    )

    # llm = GoogleGenAI(
    #     model="gemini-2.0-flash-001",
    #     vertexai_config={"project": "project-id", "location": "us-central1"},
    # )

    # Load the tools from the Toolbox server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = await client.aload_toolset()

        agent = AgentWorkflow.from_tools_or_functions(
            tools,
            llm=llm,
        )

        for tool in tools:
            print(tool.metadata)

        ctx = Context(agent)

        for query in queries:
            response = await agent.run(user_msg=query, ctx=ctx)
            print()
            print(f"---- {query} ----")
            print(str(response))


await run_application()
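The top-level `await run_application()` works here because notebooks already run inside an event loop. In a standalone script there is no running loop, so use `asyncio.run` as the entry point instead. A minimal sketch with a placeholder coroutine standing in for the workflow above:

```python
import asyncio


async def run_application():
    # Placeholder for the agent workflow shown above; a real script would
    # load the Toolbox tools and run the agent queries here.
    return "done"


# Outside a notebook, start the event loop explicitly:
result = asyncio.run(run_application())
print(result)
```

`asyncio.run` creates a fresh event loop, runs the coroutine to completion, and closes the loop, which is the idiomatic entry point for async scripts.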

Advanced Toolbox Features

Toolbox has a variety of features that make developing Gen AI tools for databases easier. For more information, read about the following features:

  • Authenticated Parameters: bind tool inputs to values from OIDC tokens automatically, making it easy to run sensitive queries without potentially leaking data
  • Authorized Invocations: restrict who can use a tool based on the user's auth token
  • OpenTelemetry: get metrics and tracing from Toolbox with OpenTelemetry
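To make the first two features concrete, here is a hedged sketch of how an authenticated parameter might be wired up in `tools.yaml`. The auth service name, client ID, and tool definition are placeholders; consult the Toolbox documentation for the exact schema:

```yaml
# Hypothetical sketch -- the auth service and tool names are placeholders.
authServices:
  my-google-auth:
    kind: google
    clientId: YOUR_CLIENT_ID.apps.googleusercontent.com

tools:
  search-my-bookings:
    kind: postgres-sql
    source: my-pg-source
    description: Search bookings for the signed-in user.
    parameters:
      - name: email
        type: string
        description: Email of the signed-in user.
        authServices:
          - name: my-google-auth
            field: email
    statement: SELECT * FROM bookings WHERE guest_email = $1;
```

With a binding like this, the `email` parameter is populated from a claim in the verified OIDC token rather than from model output, so the LLM never sees or controls the sensitive value.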