# examples/server
This is a simple web server that provides a chat interface for interacting with an AI code agent, powered by smolagents and the Qwen/Qwen3-Next-80B-A3B-Thinking model and enhanced with MCP (Model Context Protocol) tools.
## Setup

Install the dependencies:

```bash
pip install starlette anyio 'smolagents[mcp]' uvicorn
```

Start the server:

```bash
uvicorn examples.server.main:app --reload
```

Then open your browser, navigate to http://localhost:8000, and interact with the AI code agent through the chat interface.
## How it works

The server consists of two main routes:

- `/` - serves the HTML page with the chat interface
- `/chat` - API endpoint that processes messages and returns responses

The server integrates with MCP tools through the following components:
```python
mcp_server_parameters = {
    "url": "https://evalstate-hf-mcp-server.hf.space/mcp",
    "transport": "streamable-http",
}

mcp_client = MCPClient(server_parameters=mcp_server_parameters)

agent = CodeAgent(
    model=InferenceClientModel(model_id="Qwen/Qwen3-Next-80B-A3B-Thinking"),
    tools=mcp_client.get_tools(),
)
```
When a user sends a message, it is posted to the `/chat` endpoint, where the agent processes it and returns a response.

The server also includes a shutdown handler that properly disconnects the MCP client when the server stops:
```python
async def shutdown():
    mcp_client.disconnect()
```
## Customization

You can modify the CodeAgent configuration by changing the model or the MCP server parameters. For example:
```python
# Custom MCP server
mcp_server_parameters = {
    "url": "your-mcp-server-url",
    "transport": "your-transport-method",
}

# Custom agent configuration
agent = CodeAgent(
    model=InferenceClientModel(model_id="your-preferred-model"),
    tools=mcp_client.get_tools(),
)
```