# MCP Integration (`gpt_researcher/mcp`)
This directory contains the comprehensive Model Context Protocol (MCP) integration for GPT Researcher. MCP enables GPT Researcher to seamlessly connect with and utilize external tools and data sources through a standardized protocol.
Model Context Protocol (MCP) is an open standard that enables secure connections between AI applications and external data sources and tools. Through MCP, GPT Researcher can discover, select, and invoke tools exposed by any MCP-compliant server as part of its research flow.
```
gpt_researcher/mcp/
├── __init__.py        # Module initialization and imports
├── client.py          # MCP client management and configuration
├── tool_selector.py   # Intelligent tool selection using LLM
├── research.py        # Research execution with selected tools
├── streaming.py       # WebSocket streaming and logging utilities
└── README.md          # This documentation
```
- **`client.py` – `MCPClientManager`**: Handles MCP server connections and client lifecycle.
- **`tool_selector.py` – `MCPToolSelector`**: Intelligent tool selection using LLM analysis.
- **`research.py` – `MCPResearchSkill`**: Executes research using the selected MCP tools.
- **`streaming.py` – `MCPStreamer`**: Real-time streaming and logging.
**Install MCP Dependencies:**

```bash
pip install langchain-mcp-adapters
```
**Set Up an MCP Server:** You need at least one MCP server to connect to. This could be a local stdio server launched as a subprocess, a WebSocket server, or a remote HTTP endpoint.
```python
import asyncio

from gpt_researcher import GPTResearcher

# MCP configuration for a local stdio server
mcp_configs = [{
    "command": "python",
    "args": ["my_mcp_server.py"],
    "name": "local_server",
    "tool_name": "search"  # Optional: pin research to a specific tool
}]

async def main():
    # Initialize the researcher with MCP support
    researcher = GPTResearcher(
        query="What are the latest developments in AI?",
        mcp_configs=mcp_configs,
    )

    # Conduct research using MCP tools, then write the report
    context = await researcher.conduct_research()
    report = await researcher.write_report()
    print(report)

asyncio.run(main())
```
```python
# WebSocket MCP server
mcp_configs = [{
    "connection_url": "ws://localhost:8080/mcp",
    "connection_type": "websocket",
    "name": "websocket_server"
}]
```

```python
# HTTP MCP server
mcp_configs = [{
    "connection_url": "https://api.example.com/mcp",
    "connection_type": "http",
    "connection_token": "your-auth-token",
    "name": "http_server"
}]
```
```python
# Multiple MCP servers in one research run
mcp_configs = [
    {
        "command": "python",
        "args": ["database_server.py"],
        "name": "database",
        "env": {"DB_HOST": "localhost"}
    },
    {
        "connection_url": "ws://localhost:8080/search",
        "name": "search_service"
    },
    {
        "connection_url": "https://api.knowledge.com/mcp",
        "connection_token": "token123",
        "name": "knowledge_base"
    }
]
```
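Under the hood, the client has to decide which transport each entry uses before connecting. A minimal sketch of that dispatch, mirroring the detection rules described below (`infer_connection_type` is a hypothetical helper for illustration, not the library's actual code):

```python
from urllib.parse import urlparse

def infer_connection_type(config: dict) -> str:
    """Infer the transport for a single MCP server config (illustrative only)."""
    # An explicit connection_type always wins.
    if "connection_type" in config:
        return config["connection_type"]
    # A command implies a stdio subprocess server.
    if "command" in config:
        return "stdio"
    # Otherwise fall back to the URL scheme.
    scheme = urlparse(config.get("connection_url", "")).scheme
    if scheme in ("ws", "wss"):
        return "websocket"
    if scheme in ("http", "https"):
        return "http"
    raise ValueError(f"Cannot infer connection type for {config.get('name')}")
```

For the three-server config above, this would yield `"stdio"`, `"websocket"`, and `"http"` respectively.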
Each MCP server configuration supports the following options:
| Field | Type | Description | Example |
|---|---|---|---|
| `name` | `str` | Unique name for the server | `"my_server"` |
| `command` | `str` | Command to start a stdio server | `"python"` |
| `args` | `list[str]` | Arguments for the command | `["server.py", "--port", "8080"]` |
| `connection_url` | `str` | URL for a WebSocket/HTTP connection | `"ws://localhost:8080/mcp"` |
| `connection_type` | `str` | Connection type | `"stdio"`, `"websocket"`, `"http"` |
| `connection_token` | `str` | Authentication token | `"your-token"` |
| `tool_name` | `str` | Specific tool to use (optional) | `"search"` |
| `env` | `dict` | Environment variables | `{"API_KEY": "secret"}` |
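A config entry needs either the stdio fields or a connection URL to be usable. A rough validation sketch of those constraints (assuming `name` is required for identification; this is an illustrative helper, not part of the library):

```python
def validate_mcp_config(config: dict) -> list:
    """Return a list of problems with one server config (illustrative only)."""
    errors = []
    if not config.get("name"):
        errors.append("missing required field: name")
    has_stdio = "command" in config
    has_url = "connection_url" in config
    if not (has_stdio or has_url):
        errors.append("need either command/args (stdio) or connection_url")
    if has_stdio and not isinstance(config.get("args", []), list):
        errors.append("args must be a list of strings")
    return errors
```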
The MCP client automatically detects connection types from the configuration:

- `ws://` or `wss://` → WebSocket
- `http://` or `https://` → HTTP

All public classes are exported from `__init__.py` for easy importing.

To customize tool selection logic, extend `MCPToolSelector`:
```python
from gpt_researcher.mcp import MCPToolSelector

class CustomToolSelector(MCPToolSelector):
    def _fallback_tool_selection(self, all_tools, max_tools):
        # Custom fallback logic
        return super()._fallback_tool_selection(all_tools, max_tools)
```
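As one example of custom fallback logic, a selector might rank tools by keyword overlap with the research topic before truncating to the limit. The helper below is purely illustrative (it treats tools as objects with a `name` attribute or plain strings; it is not the library's selection algorithm):

```python
def keyword_fallback(all_tools: list, max_tools: int, keywords: set) -> list:
    """Prefer tools whose names mention a research keyword (illustrative)."""
    def score(tool) -> int:
        name = getattr(tool, "name", str(tool)).lower()
        return sum(1 for kw in keywords if kw in name)

    # Stable sort: ties keep their original order.
    ranked = sorted(all_tools, key=score, reverse=True)
    return ranked[:max_tools]
```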
Extend `MCPResearchSkill` for custom result processing:

```python
from gpt_researcher.mcp import MCPResearchSkill

class CustomResearchSkill(MCPResearchSkill):
    def _process_tool_result(self, tool_name, result):
        # Custom result processing
        return super()._process_tool_result(tool_name, result)
```
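Custom result processing typically means flattening whatever a tool returns into text the researcher can use as context. A minimal sketch of that idea (hypothetical helper and truncation limit, not the library's actual behavior):

```python
import json

def normalize_tool_result(result, max_chars: int = 4000) -> str:
    """Flatten an arbitrary tool result into context text (illustrative only)."""
    if isinstance(result, str):
        text = result
    elif isinstance(result, (dict, list)):
        # Structured results become pretty-printed JSON.
        text = json.dumps(result, ensure_ascii=False, indent=2)
    else:
        text = str(result)
    # Keep the context window under control.
    return text[:max_chars]
```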
- **Import Error**: `langchain-mcp-adapters` is not installed. Fix with `pip install langchain-mcp-adapters`.
- **Connection Failed**: Check the server URL and authentication token.
- **No Tools Available**: The server may not be exposing any tools.
- **Tool Selection Issues**: The LLM may not select appropriate tools; set `tool_name` in the server config to force a specific tool.
Enable debug logging for detailed information:

```python
import logging

logging.getLogger("gpt_researcher.mcp").setLevel(logging.DEBUG)
```
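Note that `setLevel` alone is not enough if no handler is configured anywhere in your application: Python's last-resort handler only emits `WARNING` and above. A standalone setup that actually prints the debug records:

```python
import logging

# Attach an explicit handler so DEBUG records are printed even when
# the root logger has no configuration of its own.
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))

mcp_logger = logging.getLogger("gpt_researcher.mcp")
mcp_logger.addHandler(handler)
mcp_logger.setLevel(logging.DEBUG)
```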
Contributions to the MCP integration are welcome!
This MCP integration brings powerful extensibility to GPT Researcher, enabling connections to virtually any data source or tool through the standardized MCP protocol.