apps/mantine.dev/src/pages/guides/llms.mdx
import { Layout } from '@/layout';
import { MDX_DATA } from '@/mdx';
export default Layout(MDX_DATA.LLMDocumentation);
Mantine provides LLM-friendly documentation to help AI tools like Cursor, Windsurf, GitHub Copilot, ChatGPT, and Claude understand and work with the Mantine UI library.
llms.txt documentation is updated with every Mantine release.
Links:

- https://mantine.dev/llms.txt
- https://mantine.dev/llms-full.txt
The LLM documentation includes:
In Cursor, you can reference the documentation using the @Docs feature:

- Use @Docs in your prompt
- Add https://mantine.dev/llms.txt as a documentation source

For Windsurf users:

- Mention @https://mantine.dev/llms.txt in your prompt
- Add it to a .windsurfrules file for persistent access

When using ChatGPT or Claude:

- Share https://mantine.dev/llms.txt at the start of the conversation

While Copilot doesn't directly support external documentation, you can:
Mantine also provides skills for AI coding agents in the
mantinedev/skills repository.
Currently available skills:
- mantine-combobox – Build custom select/autocomplete/multiselect components with Combobox
- mantine-form – Build forms with @mantine/form, validation, nested fields, and form context
- mantine-custom-components – Create custom components with Mantine factory APIs and Styles API

Install each skill from the repository:
```bash
npx skills add https://github.com/mantinedev/skills --skill mantine-combobox
npx skills add https://github.com/mantinedev/skills --skill mantine-form
npx skills add https://github.com/mantinedev/skills --skill mantine-custom-components
```
In your AI prompt, explicitly tell the agent to use one of the installed skills.
Examples:
- "$mantine-form and build a profile form with validation and nested fields"
- "$mantine-combobox and create a searchable multi-select with custom option rendering"
- "$mantine-custom-components and scaffold a polymorphic component with Styles API support"

If your agent does not support $skill-name mentions, reference the skill name in plain text and ask the agent to follow it.
Mantine also provides an MCP server package:
`@mantine/mcp-server`. The server reads Mantine static MCP data published on mantine.dev and exposes tools that AI agents can call directly:

- `list_items`
- `get_item_doc`
- `get_item_props`
- `search_docs`

Most MCP-compatible tools support adding servers with a JSON configuration. Use this server definition:
```json
{
  "mcpServers": {
    "mantine": {
      "command": "npx",
      "args": ["-y", "@mantine/mcp-server"]
    }
  }
}
```
To use a different data source (for example, alpha docs or local static files), add env variables:
```json
{
  "mcpServers": {
    "mantine": {
      "command": "npx",
      "args": ["-y", "@mantine/mcp-server"],
      "env": {
        "MANTINE_MCP_DATA_URL": "https://mantine.dev/mcp"
      }
    }
  }
}
```
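The JSON definitions above can also be generated programmatically. Below is a minimal sketch; `mantineMcpConfig` is a hypothetical helper (not part of `@mantine/mcp-server`) that builds the same `mcpServers` entry, optionally setting `MANTINE_MCP_DATA_URL` to point at a custom data source:

```typescript
// Hypothetical helper (not part of any Mantine package): builds the
// mcpServers entry shown above, optionally with a custom data URL.
interface McpServerEntry {
  command: string;
  args: string[];
  env?: Record<string, string>;
}

function mantineMcpConfig(dataUrl?: string): { mcpServers: Record<string, McpServerEntry> } {
  const entry: McpServerEntry = {
    command: 'npx',
    args: ['-y', '@mantine/mcp-server'],
  };
  if (dataUrl) {
    // MANTINE_MCP_DATA_URL overrides the default data source
    entry.env = { MANTINE_MCP_DATA_URL: dataUrl };
  }
  return { mcpServers: { mantine: entry } };
}

// Print the config to paste into a client's MCP settings
console.log(JSON.stringify(mantineMcpConfig('https://mantine.dev/mcp'), null, 2));
```

Omitting the argument yields the default configuration from the first JSON snippet.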
Most MCP clients can reuse the mantine server configuration above. If the client supports custom MCP servers, add the same command and args:

- command: `npx`
- args: `["-y", "@mantine/mcp-server"]`

Then use prompts like:
Here are some example prompts you can use with AI tools:
The LLM documentation is automatically generated from our source files using a compilation script. It includes:
There are two generated formats:
- llms.txt – the default compact index that links to per-page .md files under the /llms path
- llms-full.txt – a single large file with all documentation content

To ensure you have the latest documentation, we regenerate these files with each release. The files follow the llms.txt standard for better compatibility with AI tools.
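To illustrate how the compact index can be consumed, here is a minimal sketch. `extractDocLinks` is a hypothetical helper (not part of any Mantine package) that assumes the index is markdown with standard `[title](url)` links to `.md` files, per the llms.txt convention; the sample index content is made up for illustration:

```typescript
// Hypothetical helper: extracts per-page .md links from an llms.txt-style
// markdown index. The sample index below is illustrative, not real data.
function extractDocLinks(index: string): { title: string; url: string }[] {
  const links: { title: string; url: string }[] = [];
  const pattern = /\[([^\]]+)\]\((\S+?\.md)\)/g;
  let match: RegExpExecArray | null;
  while ((match = pattern.exec(index)) !== null) {
    links.push({ title: match[1], url: match[2] });
  }
  return links;
}

const sample = [
  '# Mantine',
  '- [Button](https://mantine.dev/llms/core/button.md): Button component docs',
  '- [use-form](https://mantine.dev/llms/form/use-form.md): Form state management',
].join('\n');

// Logs the { title, url } pairs extracted from the sample index
console.log(extractDocLinks(sample));
```

The example URLs are placeholders; fetch the real index from https://mantine.dev/llms.txt to get the actual page list.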
If you find any issues with the LLM documentation or have suggestions for improvement, please open an issue on our GitHub repository.