Note: This content has been moved to a dedicated repository: https://github.com/assafelovic/gptr-mcp
The GPT Researcher MCP Server enables AI assistants like Claude to conduct comprehensive web research and generate detailed reports via the Model Context Protocol (MCP).
While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than standard search (roughly a 30-second wait), it delivers focused, high-quality results that are ready to use as context.
The server exposes the following tools:

- `research_resource`: Get web resources related to a given task via research.
- `deep_research`: Performs deep web research on a topic, finding reliable and relevant information.
- `quick_search`: Performs a fast web search optimized for speed over quality.
- `write_report`: Generate a report based on research results.
- `get_research_sources`: Get the sources used in the research.
- `get_research_context`: Get the full context of the research.

For detailed installation and usage instructions, please visit the official repository.
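MCP clients invoke tools like these over JSON-RPC 2.0 using a `tools/call` request. As an illustrative sketch only (the `query` argument name is an assumption, not taken from the repository), a call to `deep_research` might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deep_research",
    "arguments": {
      "query": "latest developments in quantum computing"
    }
  }
}
```

The exact argument schema for each tool is advertised by the server in its `tools/list` response, so a client should consult that rather than hard-code parameter names.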
Quick start:
Clone the new repository:

```bash
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp
```
Install dependencies:

```bash
pip install -r requirements.txt
```
Create a `.env` file with your API keys:

```
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
```
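A missing key is easier to diagnose before the server starts than from a failed API call at research time. This optional self-check is a sketch, not part of the repository; only the two key names from the `.env` entries above are assumed:

```python
import os

# The API keys the server expects to find in its environment.
REQUIRED_KEYS = ["OPENAI_API_KEY", "TAVILY_API_KEY"]

def missing_keys(env=os.environ):
    """Return the required keys that are unset or empty."""
    return [key for key in REQUIRED_KEYS if not env.get(key)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        print("Missing API keys:", ", ".join(missing))
    else:
        print("All required API keys are set.")
```

Run it in the same shell (or with the same `.env` loaded) that will launch the server.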
Run the server:

```bash
python server.py
```
For Docker deployment, Claude Desktop integration, example usage, and troubleshooting, please refer to the full documentation.