DEVELOPER.md
This document provides instructions for setting up your development environment and contributing to the Toolbox project.
Before you begin, ensure you have the following:
- Databases: Set up the necessary databases for your development environment.
- Go: Install the latest version of Go.
- Dependencies: Download and manage project dependencies:

  ```sh
  go get
  go mod tidy
  ```
- Configuration: Create a `tools.yaml` file to configure your sources and tools. See the Configuration section in the README for details.
- CLI Flags: List available command-line flags for the Toolbox server:

  ```sh
  go run . --help
  ```
- Running the Server: Start the Toolbox server with optional flags. The server listens on port 5000 by default.

  ```sh
  go run .
  ```
- Testing the Endpoint: Verify the server is running by sending a request to the endpoint:

  ```sh
  curl http://127.0.0.1:5000
  ```
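As a sketch of the `tools.yaml` mentioned in the setup steps above — the authoritative schema lives in the README's Configuration section — a minimal file pairing a source with a tool might look like the following. The `kind: source` entry and all connection field names here are illustrative assumptions, not confirmed schema; only the `kind: tool` fields mirror the naming example later in this document.

```yaml
# Illustrative sketch only — consult the README's Configuration section
# for the authoritative schema. Connection fields below are assumptions.
kind: source
name: my_pg_source
type: postgres
host: 127.0.0.1
port: 5432
---
kind: tool
name: cancel_hotel        # tool name (snake_case)
type: postgres-sql        # tool type (kebab-case)
source: my_pg_source
```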
Most developers work in a Unix or Unix-like environment.
Compiling for Windows requires downloading Zig to provide a C and C++ compiler. These instructions cover cross-compiling from Linux x86 but should work on macOS with small changes.
Download Zig for your platform:

```sh
cd $HOME
curl -fL "https://ziglang.org/download/0.15.2/zig-x86_64-linux-0.15.2.tar.xz" -o zig.tar.xz
tar xf zig.tar.xz
```
This creates the directory `$HOME/zig-x86_64-linux-0.15.2`. You only need to do this once. If you are on macOS, download from https://ziglang.org/download/0.15.2/zig-x86_64-macos-0.15.2.tar.xz or https://ziglang.org/download/0.15.2/zig-aarch64-macos-0.15.2.tar.xz instead.
Change to your MCP Toolbox directory and run the following:

```sh
cd $HOME/mcp-toolbox
GOOS=windows \
GOARCH=amd64 \
CGO_ENABLED=1 \
CC="$HOME/zig-x86_64-linux-0.15.2/zig cc -target x86_64-windows-gnu" \
CXX="$HOME/zig-x86_64-linux-0.15.2/zig c++ -target x86_64-windows-gnu" \
go build -o toolbox.exe
```
If you are on macOS, replace `zig-x86_64-linux-0.15.2` with the proper path for your Zig installation. The `toolbox.exe` file is now ready to use. Transfer it to your Windows machine and test it.
This section details the purpose of, and conventions for, MCP Toolbox's tool naming properties: the tool name and the tool type.
```yaml
kind: tool
name: cancel_hotel   # <- tool name
type: postgres-sql   # <- tool type
source: my_pg_source
```
Tool name is the identifier used by a Large Language Model (LLM) to invoke a specific tool.
The following guidelines apply to tool names:

- Use snake_case (e.g., `list_collections` instead of `list-collections`).
- Do not prefix the name with the source type (e.g., `list_collections` instead of `firestore_list_collections`).

Tool type serves as a category or type that a user can assign to a tool. The following guidelines apply to tool types:

- Use kebab-case (e.g., `firestore-list-collections` instead of `firestore_list_collections`).
- Include the source as a prefix (e.g., `firestore-list-collections` over `list-collections`).

To align with the Model Context Protocol (MCP) and ensure robust agentic workflows, Toolbox distinguishes between errors the agent can fix and errors that require developer intervention.
When implementing Invoke() or ParseParams(), you must return the appropriate error type from internal/util/errors.go. This allows the LLM to attempt a "self-correct" for Agent Errors while signaling a hard stop for Server Errors.
| Category | Description | HTTP Status | MCP Result |
|---|---|---|---|
| Agent Error (`AgentError`) | Input/Execution logic errors (e.g., SQL syntax, missing records, invalid params). The agent can fix this. | 200 OK | `isError: true` |
| Server Error (`ClientServerError`) | Infrastructure failures (e.g., DB down, auth failure, network failure). The agent cannot fix this. | 500 Internal Error | JSON-RPC Error |
Use typed errors: refactor or implement the `Tool` interface methods to return `util.ToolboxError`.

In `Invoke()`:

- Return an `AgentError` for input or execution logic errors the agent can fix.
- Return a `ClientServerError` for infrastructure failures.

In `ParseParams()`:

- Return a `ToolboxError` for missing required parameters or wrong types.
- Return a `ClientServerError` for failures in resolving authenticated parameters (e.g., invalid tokens).

Example:
```go
func (t *MyTool) Invoke(ctx context.Context, sp tools.SourceProvider, params parameters.ParamValues, token tools.AccessToken) (any, util.ToolboxError) {
	res, err := t.db.Exec(ctx, params.SQL)
	if err != nil {
		// Driver error is likely a syntax issue the LLM can fix
		return nil, util.NewAgentError("error executing SQL query", err)
	}
	return res, nil
}
```
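The `ParseParams()` side of this split can be illustrated with a self-contained sketch. The types below are simplified local stand-ins for the real ones in `internal/util/errors.go` — the names, shapes, and the `IsAgentError()` helper are illustrative assumptions, not the actual API:

```go
package main

import "fmt"

// Stand-ins for the real error types in internal/util/errors.go.
// The names and shapes here are illustrative only.
type ToolboxError interface {
	error
	IsAgentError() bool
}

type agentError struct{ msg string }

func (e *agentError) Error() string      { return e.msg }
func (e *agentError) IsAgentError() bool { return true }

type serverError struct{ msg string }

func (e *serverError) Error() string      { return e.msg }
func (e *serverError) IsAgentError() bool { return false }

// parseParams mirrors the ParseParams contract described above:
// agent-fixable input problems vs. infrastructure/auth failures.
func parseParams(data map[string]any, tokenValid bool) (map[string]any, ToolboxError) {
	if _, ok := data["sql"]; !ok {
		// Missing required parameter: surfaced as isError: true with
		// HTTP 200, so the LLM can self-correct and retry.
		return nil, &agentError{msg: "missing required parameter: sql"}
	}
	if !tokenValid {
		// Failure to resolve an authenticated parameter: surfaced as a
		// JSON-RPC error with HTTP 500; the agent cannot fix this.
		return nil, &serverError{msg: "failed to resolve authenticated parameter"}
	}
	return data, nil
}

func main() {
	_, err := parseParams(map[string]any{}, true)
	fmt.Printf("agent-fixable: %v (%s)\n", err.IsAgentError(), err)
	_, err = parseParams(map[string]any{"sql": "SELECT 1"}, false)
	fmt.Printf("agent-fixable: %v (%s)\n", err.IsAgentError(), err)
}
```

The point of the sketch is only the routing decision: validation failures become agent errors, auth/infrastructure failures become server errors.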
Please create an issue before implementation to ensure we can accept the contribution and to avoid duplicated work. This issue should include an overview of the API design. If you have any questions, reach out on our Discord to chat directly with the team.
> [!NOTE]
> New tools can be added for pre-existing data sources. However, any new database source should also include at least one new tool type.
We recommend looking at an example source implementation.
- Create a new directory under `internal/sources` for your database type (e.g., `internal/sources/newdb`).
- Create a new file, `newdb.go`. Create a `Config` struct to include all the necessary parameters for connecting to the database (e.g., host, port, username, password, database name) and a `Source` struct to store necessary parameters for tools (e.g., Name, Type, connection object, additional config).
- Implement the `SourceConfig` interface. This interface requires two methods:
  - `SourceConfigType() string`: Returns a unique string identifier for your data source (e.g., "newdb").
  - `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`: Creates a new instance of your data source and establishes a connection to the database.
- Implement the `Source` interface. This interface requires one method:
  - `SourceType() string`: Returns the same string identifier as `SourceConfigType()`.
- Use `init()` to register the new Source.
- Add unit tests in `newdb_test.go`.

> [!NOTE]
> Please follow the tool naming convention detailed here.
We recommend looking at an example tool implementation.
Remember to keep your PRs small. For example, if you are contributing a new Source, only include one or two core Tools in the same PR; the rest of the Tools can come in subsequent PRs.
- Create a new directory under `internal/tools` for your tool type (e.g., `internal/tools/newdb/newdbtool`).
- Create a new file, `newdbtool.go`. Create a `Config` struct and a `Tool` struct to store necessary parameters for tools.
- Implement the `ToolConfig` interface. This interface requires the following methods:
  - `ToolConfigType() string`: Returns a unique string identifier for your tool (e.g., "newdb-tool").
  - `Initialize(sources map[string]Source) (Tool, error)`: Creates a new instance of your tool and validates that it can connect to the specified data source.
- Implement the `Tool` interface. This interface requires the following methods:
  - `Invoke(ctx context.Context, params map[string]any) ([]any, error)`: Executes the operation on the database using the provided parameters.
  - `ParseParams(data map[string]any, claims map[string]map[string]any) (ParamValues, error)`: Parses and validates the input parameters.
  - `Manifest() Manifest`: Returns a manifest describing the tool's capabilities and parameters.
  - `McpManifest() McpManifest`: Returns an MCP manifest describing the tool for use with the Model Context Protocol.
  - `Authorized(services []string) bool`: Checks if the tool is authorized to run based on the provided authentication services.
- Use `init()` to register the new Tool.
- Add unit tests in `newdbtool_test.go`.

Add a test file under a new directory `tests/newdb`.
Add the pre-defined integration test suites in `tests/newdb/newdb_integration_test.go`; they are required to run as long as your code contains the related features. Check each test suite for its config defaults; if your source requires test suite config updates, refer to the config options:

- `RunToolGetTest`: tests the GET endpoint that returns the tool's manifest.
- `RunToolInvokeTest`: tests tool calling through the native Toolbox endpoints.
- `RunMCPToolCallMethod`: tests tool calling through the MCP endpoints.
- (Optional) `RunExecuteSqlToolInvokeTest`: tests an execute-sql tool for any source. Only run this test if you are adding an execute-sql tool.
- (Optional) `RunToolInvokeWithTemplateParameters`: tests template parameters. Only run this test if template parameters apply to your tool.
Add additional tests for the tools that are not covered by the predefined tests. Every tool must be tested!
Add the new database to the integration test workflow in integration.cloudbuild.yaml.
When updating documentation, you must adhere to the structural constraints enforced by our Diátaxis-based layout and internal linters:

- Create a new directory under `docs/en/integrations/` (e.g., `docs/en/integrations/newdb/`).
- Add an `_index.md` file. This acts purely as a structural folder wrapper for Hugo. Do not add body content here.
- Add a `source.md` file. This is the definitive guide. Add all connection details, authentication, and YAML configurations here. Ensure you include the `{{< list-tools >}}` shortcode to dynamically display tools.
- Create a `tools/` directory inside your source (e.g., `docs/en/integrations/newdb/tools/`).
- Add an `_index.md` file inside the `tools/` directory. It must contain only frontmatter and absolutely no markdown body text.
- Add a `<tool_name>.md` file in this new `tools/` folder. Ensure you include the `{{< compatible-sources >}}` shortcode.
- To reuse tools shared with other sources, create a `tools/` directory with an `_index.md` file and add a `shared_tools` YAML array to the frontmatter of this `tools/_index.md` file. This file must strictly contain only frontmatter.
- Quickstart content lives under `docs/en/documentation/getting-started/quickstart/`.
- Place database-specific samples under `docs/en/integrations/<db>/samples/`. This directory must include an `_index.md` with strictly only frontmatter. General samples live under `docs/en/samples/`. Sample frontmatter must include:
  - `is_sample: true` - Required for indexing.
  - `sample_filters:` - A YAML array used for UI filtering (e.g., `[postgres, go, sql]`).
- If you add a new top-level section (e.g., `integrations`, `documentation`), you must update the AI documentation layout files located at `.hugo/layouts/index.llms.txt` and `.hugo/layouts/index.llms-full.txt`. Specifically, update the "Diátaxis Narrative Framework" preamble so AI models understand the purpose of your new section.

You can provide developers with a set of "build-time" tools to aid common software development user journeys like viewing and creating tables/collections and data.
- Define the prebuilt tools in a `tools.yaml` and add it to `internal/tools`. Make sure the file name matches the source (i.e., for source "alloydb-postgres", create a file named "alloydb-postgres.yaml").
- Update `cmd/root.go` to add the new source to the prebuilt flag.

Toolbox uses both GitHub Actions and Cloud Build to run test workflows. Cloud Build is used when Google credentials are required. Cloud Build uses the test project "toolbox-testing-438616".
Run the lint check to ensure code quality:

```sh
golangci-lint run --fix
```
To ensure consistency, we enforce a standardized structure for integration Source and Tool pages.
Before pushing changes to integration pages:
Run the source page linter to validate:

```sh
# From the repository root
./.ci/lint-docs-source-page.sh
```
Run the tool page linter to validate:

```sh
# From the repository root
./.ci/lint-docs-tool-page.sh
```
Execute unit tests locally:

```sh
go test -race -v ./cmd/... ./internal/...
```
Environment Variables: Set the required environment variables. Refer to the Cloud Build testing configuration for a complete list of variables for each source.

- `SERVICE_ACCOUNT_EMAIL`: Use your own GCP email.
- `CLIENT_ID`: Use the Google Cloud SDK application Client ID. Contact Toolbox maintainers if you don't have it.

Running Tests: Run the integration test for your target source. Specify the required Go build tags at the top of each integration test file.

```sh
go test -race -v ./tests/<YOUR_TEST_DIR>
```

For example, to run the AlloyDB integration test:

```sh
go test -race -v ./tests/alloydbpg
```
Timeout: The integration test should have a timeout on the server. Look for code like this:

```go
ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
defer cancel()

cmd, cleanup, err := tests.StartCmd(ctx, toolsFile, args...)
if err != nil {
	t.Fatalf("command initialization returned an error: %s", err)
}
defer cleanup()
```
Be sure to set the timeout to a reasonable value for your tests.
- Comment `/gcbrun` to execute the integration tests.
- Add the `tests:run` label to execute the unit tests.
- Add the `docs: deploy-preview` label to run the PR Preview workflow.

The following databases have been added as test resources. To add a new database to test against, please contact the Toolbox maintainer team via an issue or PR. Refer to the Cloud Build testing configuration for a complete list of variables for each source.

- `alloydb-ai-nl` tool tests

We use lychee for repository link checks.
Update the Link: Correct the broken URL or update the content where it is used.
Ignore the Link: If you can't fix the link (e.g., due to external rate-limits or if it's a local-only URL), tell Lychee to ignore it.
Add the pattern to `.lycheeignore`:

```
# These are email addresses, not standard web URLs, and usually cause check failures.
^mailto:.*
```
> [!NOTE]
> To avoid build failures in GitHub Actions, follow the linking pattern demonstrated here:
>
> Avoid (works in Hugo, breaks the link checker):
>
> `[Read more](docs/setup)` or `[Read more](docs/setup/)`
>
> Reason: The link checker cannot find a file named "setup" or a directory with that name containing an index.
>
> Preferred:
>
> `[Read more](docs/setup.md)`
>
> Reason: The GitHub Action finds the physical file. Hugo then uses its internal logic (or render hooks) to resolve this to the correct `/docs/setup/` web URL.
- Header checker lint (`.github/header-checker-lint.yml`): ensures files have the appropriate license header.

To maintain consistency and prevent repository bloat, all pull requests must pass the automated documentation linters.
Source pages (`integrations/**/source.md`): When adding or updating a Source page, your markdown file must strictly adhere to the following architectural rules:

- Body content belongs only in `source.md`. (Note: `_index.md` files are purely structural folder wrappers. Do not add body content to them.)
- The frontmatter `title` field must always end with the word "Source" (e.g., `title: "Firestore Source"`).
- Do not use H1 (`#`) tags in the markdown body. The page title is automatically generated from the frontmatter.
- Use H2 (`##`) headings in a strict, specific order:
  - Required: About, Example, Reference
  - Optional: Available Tools, Requirements, Advanced Usage, Troubleshooting, Additional Resources
- If you include an `## Available Tools` heading, you must place the `list-tools` shortcode (e.g., `{{< list-tools >}}`) directly beneath it.

Tool pages (`integrations/**/tools/*.md`): When adding or updating a Tool page, your markdown file must strictly adhere to the following architectural rules:

- The page must live in the `tools/` directory.
- Do not use H1 (`#`) tags in the markdown body. The page title is automatically generated from the frontmatter.
- Use H2 (`##`) headings in a strict, specific order:
  - Required: About, Example
  - Optional: Compatible Sources, Requirements, Parameters, Output Format, Reference, Advanced Usage, Troubleshooting, Additional Resources
- If you include an `## Compatible Sources` heading, you must place the `compatible-sources` shortcode (e.g., `{{< compatible-sources >}}`) directly beneath it.

Prebuilt config pages (`integrations/**/prebuilt-configs/*.md`): To ensure new prebuilt configurations are automatically indexed by the `{{< list-prebuilt-configs >}}` shortcode on the main Prebuilt Configs page, follow these rules:

- Place pages in a `prebuilt-configs/` directory inside the database folder (e.g., `docs/en/integrations/alloydb/prebuilt-configs/`).
- The `prebuilt-configs/` directory must contain an `_index.md` file. This file acts as the anchor for the directory and must contain the title and description used in the automated lists.
- Group pages by the `kind` defined in the tool's YAML file (in `internal/prebuiltconfigs/tools/`). For example, any tool using the `postgres` kind should live in the `postgres/` integration directory.

If you need to modify the visual appearance, navigation, or behavior of the documentation website itself, all frontend assets are isolated within the `.hugo/` directory.
No file in the `docs/` directory may exceed 24 MB. This prevents repository bloat and ensures fast clone times. If you need to include large assets (like high-resolution videos or massive PDFs), host them externally and link to them in the markdown.

Follow these steps to preview documentation changes locally using a Hugo server:
Install Hugo: Ensure you have Hugo extended edition version 0.146.0 or later installed.
Navigate to the Hugo Directory:

```sh
cd .hugo
```
Install Dependencies:

```sh
npm ci
```
Generate Search Index & Start the Server: Because the Pagefind search engine requires physical files to build its index, `hugo server` (which runs purely in memory) will not display search results by default. To test the search bar locally, build the physical site once (using the development environment to avoid triggering production analytics), generate the index into the static folder, and then start the server:

```sh
hugo --environment development
npx pagefind --site public --output-path static/pagefind
hugo server
```

(Note: The `static/pagefind/` directory is git-ignored to prevent committing local search indexes.)
Documentation preview links are automatically generated and commented on your pull request when working from a branch within the main repository.
For external contributors (forks): for security reasons, automated Cloudflare deployment previews are disabled for pull requests originating from external forks. To review your documentation changes, please follow the Running a Local Hugo Server instructions to build and view the site on your local machine before requesting a review.
The documentation uses a dynamic versioning system that outputs standard HTML sites alongside AI-optimized plain text files (llms.txt and llms-full.txt).
Search Indexing: All deployment workflows automatically execute npx pagefind --site public to generate a version-scoped search index specific to that deployment's base URL.
There are 3 GHA workflows we use to achieve document versioning:
Deploy In-development docs:
This workflow is run on every commit merged into the main branch. It deploys
the built site to the /dev/ subdirectory for the in-development
documentation.
Deploy Versioned Docs: When a new GitHub Release is published, it performs two deployments based on the new release tag. One to the new version subdirectory and one to the root directory of the cloudflare-pages branch.
Note: Before the release PR from release-please is merged, add the newest version into the hugo.toml file.
Deploy Previous Version Docs: This is a manual workflow, started from the GitHub Actions UI, used to rebuild and redeploy documentation for versions that were released before this new system was in place. Start the workflow from the UI by providing the git version tag you want to build documentation for. The specific versioned subdirectory and the root docs are updated on the cloudflare-pages branch.
Request a repo owner to run the preview deployment workflow on your PR. A preview link will be automatically added as a comment to your PR.
- The workflow definitions live in the `.github/workflows/` directory.
- Add the `docs: deploy-preview` label to the PR to deploy a documentation preview.

This repository includes custom shortcodes to help with documentation consistency and maintenance. For more information on how they work, see the Hugo Shortcodes documentation and the guide to creating custom shortcodes.
`include` shortcode: reads a file and optionally fences it with a language.

Syntax:

```
{{< include "path/to/file" "language" >}}
```

Example:

```
{{< include "static/headers/license_header.txt" >}}
{{< include "samples/program.js" "javascript" >}}
```

Source: `.hugo/layouts/shortcodes/include.html`
`regionInclude` shortcode: reads a file, extracts content between `[START region_name]` and `[END region_name]`, and optionally fences it.

Syntax:

```
{{< regionInclude "path/to/file" "region_name" "language" >}}
```

Example Markdown:

```
{{< regionInclude "samples/program.js" "program_setup" "javascript" >}}
```

Example Code Snippet (`samples/program.js`):

```javascript
// [START program_setup]
import { Toolbox } from '@googleapis/mcp-toolbox';
const toolbox = new Toolbox();
// [END program_setup]
```

Source: `.hugo/layouts/shortcodes/regionInclude.html`
Build Command: Compile the Toolbox binary:

```sh
go build -o toolbox
```
Running the Binary: Execute the compiled binary with optional flags. The server listens on port 5000 by default:

```sh
./toolbox
```
Testing the Endpoint: Verify the server is running by sending a request to the endpoint:

```sh
curl http://127.0.0.1:5000
```
Build Command: Build the Toolbox container image:

```sh
docker build -t toolbox:dev .
```
View Image: List available Docker images to confirm the build:

```sh
docker images
```
Run Container: Run the Toolbox container image using Docker:

```sh
docker run -d toolbox:dev
```
Refer to the SDK developer guide for instructions on developing Toolbox SDKs.
Team @googleapis/senseai-eco has been set as
CODEOWNERS. The GitHub TeamSync tool is used to create
this team from MDB Group, senseai-eco. Additionally, database-specific GitHub
teams (e.g., @googleapis/toolbox-alloydb) have been created from MDB groups to
manage code ownership and review for individual database products.
After an issue is created, maintainers will assign the following labels:
- Priority (defaulted to P0)
- Type (if applicable)
- Product (if applicable)

All incoming issues and PRs will adhere to the following SLO:
| Type | Priority | Objective |
|---|---|---|
| Feature Request | P0 | Must respond within 5 days |
| Process | P0 | Must respond within 5 days |
| Bugs | P0 | Must respond within 5 days, and resolve/closure within 14 days |
| Bugs | P1 | Must respond within 7 days, and resolve/closure within 90 days |
| Bugs | P2 | Must respond within 30 days |
Types that are not listed in the table do not adhere to any SLO.
Toolbox has two types of releases: versioned and continuous. Releases are built in the Google Cloud project database-toolbox. Continuous releases are tagged `latest`.

- The versioned release process is defined in `versioned.release.cloudbuild.yaml`.
- `.github/release-please.yml` automatically creates GitHub Releases and release PRs. To force a specific release version, use an empty commit:

  ```sh
  git commit -m "chore: release 0.1.0" -m "Release-As: 0.1.0" --allow-empty
  ```

Run the following command (from the root directory):
```sh
export VERSION="v0.0.0"
.ci/generate_release_table.sh
```
1. Copy the table output.
2. In the GitHub UI, navigate to Releases and click the edit button.
3. Paste the table at the bottom of the release notes and click Update release.
The following operating systems and architectures are supported for binary releases:
The following base container images are supported for container image releases:
Integration and unit tests are automatically triggered via Cloud Build on each pull request. Integration tests run on merge and nightly.
On-merge and nightly tests that fail send notifications via the Cloud Build Failure Reporter GitHub Actions workflow.
Configure a Cloud Build trigger using the UI or `gcloud` with the following settings:

- Branch pattern: `^main$`

Trigger pull request tests for external contributors by:

- Commenting `/gcbrun`
- Adding the `tests:run` label