The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications: it provides a standardized way to connect AI models to different data sources and tools. MCP servers act as intermediaries between large language models (LLMs), such as Claude, and external tools or data sources. They are small programs that expose functionality to LLMs, enabling them to interact with the outside world through MCP. In essence, an MCP server is an API designed for an LLM to use.
MCP servers define a set of "tools": functions the LLM can invoke, covering a wide range of capabilities.
Here's how MCP works: Cline acts as an MCP client that connects to one or more MCP servers. Each server advertises the tools it provides, and when the LLM decides a tool is needed, Cline invokes it through the server and returns the result to the model as additional context.
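To make the flow concrete, here is a toy sketch of the request/response shape a server handles. MCP uses JSON-RPC 2.0, and `tools/list` and `tools/call` are real methods from the MCP specification, but everything else here (the `add` tool, the in-process dispatch) is invented for illustration; a real server should be built with an official MCP SDK rather than by hand:

```python
import json

# Toy tool registry: maps tool names to callables.
# The "add" tool is made up for this example.
TOOLS = {
    "add": lambda args: args["a"] + args["b"],
}

def handle_request(req: dict) -> dict:
    """Handle a single JSON-RPC 2.0 request for tools/list or tools/call."""
    if req["method"] == "tools/list":
        # Advertise available tools to the client.
        result = {"tools": [{"name": name} for name in TOOLS]}
    elif req["method"] == "tools/call":
        # Run the requested tool and wrap its output as text content.
        name = req["params"]["name"]
        value = TOOLS[name](req["params"]["arguments"])
        result = {"content": [{"type": "text", "text": str(value)}]}
    else:
        return {"jsonrpc": "2.0", "id": req["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": req["id"], "result": result}

# Example: the client asks the server to run the "add" tool.
response = handle_request({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
})
print(json.dumps(response))
```

The key point is the contract: the client discovers tools, calls them by name with JSON arguments, and receives structured results it can hand back to the model.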
The potential of MCP servers is vast. For example, a server can fetch live data from the web, query a database, automate a browser, call an external API, or integrate with developer tools such as GitHub, giving the LLM capabilities it would not otherwise have.
Cline does not come with any pre-installed MCP servers. You'll need to find and install them separately.
Choose the right approach for your needs: install an existing community server when one covers your workflow, or build a custom server when nothing available fits.
MCP servers work with both the Cline VS Code extension and the Cline CLI. If you use the CLI, see MCP Server Configuration for the CLI to get set up.
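As a rough illustration, a server entry in Cline's MCP settings JSON looks like the following. The exact file name and location vary by platform and Cline version, so treat this as a sketch of the shape rather than a definitive reference; the filesystem server shown is one of the official example servers from the modelcontextprotocol organization, and the allowed directory path is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```

Each entry names a server and tells Cline how to launch it; once configured, the server's tools become available to the model.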
Cline simplifies building and using MCP servers through its AI capabilities: you can ask it to scaffold a new server, implement its tools, and add the server to your configuration.
When working with MCP servers, follow security best practices: install servers only from sources you trust, review their source code before running them, grant credentials with the least privilege necessary, and be deliberate about which tools you allow to run without approval.
To find and learn about MCP servers, start with the official Model Context Protocol documentation and the community's open-source server repositories.