# MEMORY_TEMPLATE_UPDATE.md
The n8n-mcp project maintains a database of workflow templates from n8n.io. This guide explains how to update the template database incrementally without rebuilding from scratch.
As of the last update:
```bash
# Build if needed
npm run build

# Fetch only NEW templates (5-10 minutes)
npm run fetch:templates:update

# Rebuild entire database from scratch (30-40 minutes)
npm run fetch:templates
```
The incremental update (`--update`) is smart and efficient:
- ✅ **Non-destructive**: All existing templates preserved
- ✅ **Fast**: Only fetches new templates (5-10 min vs 30-40 min)
- ✅ **API friendly**: Reduces load on the n8n.io API
- ✅ **Safe**: Preserves AI-generated metadata
- ✅ **Smart**: Automatically skips duplicates
| Mode | Templates Fetched | Time | Use Case |
|---|---|---|---|
| Update | Only new (~50-200) | 5-10 min | Regular updates |
| Rebuild | All (~8000+) | 30-40 min | Initial setup or corruption |
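The "only new" behavior in the table above boils down to filtering the fetched template IDs against the IDs already stored. Here is a minimal shell sketch of that dedup step (the real fetcher does this in TypeScript; the file names are hypothetical):

```shell
# dedupe_ids: print fetched template IDs that are not already known.
#   $1: file with one known template ID per line
#   $2: file with one ID per line as returned by the n8n.io API
dedupe_ids() {
  existing="$1"
  fetched="$2"
  # -F fixed strings, -x whole-line match, -v invert: keep only new IDs
  grep -v -x -F -f "$existing" "$fetched"
}
```

Only the IDs this prints would then be fetched in full, which is why update mode touches roughly 50-200 templates instead of 8000+.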
```bash
# Incremental update (recommended)
npm run fetch:templates:update

# Full rebuild
npm run fetch:templates
```
```bash
# Update templates and generate AI metadata
npm run fetch:templates -- --update --generate-metadata

# Or just generate metadata for existing templates
npm run fetch:templates -- --metadata-only

# See all options
npm run fetch:templates -- --help
```
Recommended update schedule:
The fetcher automatically filters templates:
```bash
# 1. Check current state
sqlite3 data/nodes.db "SELECT COUNT(*) FROM templates"

# 2. Build project (if code changed)
npm run build

# 3. Run incremental update
npm run fetch:templates:update

# 4. Verify new templates added
sqlite3 data/nodes.db "SELECT COUNT(*) FROM templates"
```
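Steps 1 and 4 can be wrapped into a small helper that reports the delta directly. This is a sketch, not a project script; it assumes `sqlite3` is on PATH and the database path used throughout this guide:

```shell
# count_templates: current row count of the templates table.
# DB is configurable so the helper can be pointed at a test database.
count_templates() {
  sqlite3 "${DB:-data/nodes.db}" "SELECT COUNT(*) FROM templates"
}

# template_delta: report how many templates an update added,
# given the counts captured before and after the run.
template_delta() {
  echo "$(($2 - $1)) new templates"
}

# Typical use:
#   before=$(count_templates)
#   npm run fetch:templates:update
#   template_delta "$before" "$(count_templates)"
```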
When you update n8n dependencies, templates remain compatible:
```bash
# 1. Update n8n (from MEMORY_N8N_UPDATE.md)
npm run update:all

# 2. Fetch new templates incrementally
npm run fetch:templates:update

# 3. Check how many templates were added
sqlite3 data/nodes.db "SELECT COUNT(*) FROM templates"

# 4. Generate AI metadata for new templates (optional, requires OPENAI_API_KEY)
npm run fetch:templates -- --metadata-only

# 5. IMPORTANT: Sanitize templates before pushing database
npm run build
npm run sanitize:templates
```
Templates are independent of n8n version - they're just workflow JSON data.
**CRITICAL**: Always run `npm run sanitize:templates` before pushing the database to remove API tokens from template workflows.
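As an extra belt-and-braces check after sanitizing, you can scan workflow JSON for obvious token patterns before pushing. This is a heuristic sketch, not the sanitizer's actual logic; the `sk-` and `Bearer` patterns are assumptions about how credentials might appear:

```shell
# scan_for_tokens: flag a workflow JSON string that looks like it still
# contains a credential. The patterns are heuristics only, not the
# project's real sanitization rules.
scan_for_tokens() {
  case "$1" in
    *sk-*|*"Bearer "*) echo "possible token found" ;;
    *)                 echo "clean" ;;
  esac
}
```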
**Note**: New templates fetched via `--update` mode will NOT have AI-generated metadata by default. You need to run `--metadata-only` separately to generate metadata for templates that don't have it yet.
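The two-step flow this note describes can be chained so metadata generation only runs if the fetch succeeds. A sketch; the `RUNNER` indirection is just an assumption to make the function dry-runnable with `echo`:

```shell
# update_then_metadata: incremental fetch, then metadata for new rows.
# RUNNER defaults to npm; set RUNNER=echo for a dry run.
update_then_metadata() {
  runner="${RUNNER:-npm}"
  "$runner" run fetch:templates:update && \
    "$runner" run fetch:templates -- --metadata-only
}
```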
If the update reports nothing to do, that's normal:

```
📊 Update mode: 0 new templates to fetch (skipping 2598 existing)
✅ All templates already have metadata
```

It means the database and its metadata are already up to date.
If you hit rate limits, use `--update` mode instead of a full rebuild.

If you suspect corruption:
```bash
# Full rebuild from scratch
npm run fetch:templates

# This will:
# - Drop and recreate templates table
# - Fetch all templates fresh
# - Rebuild search indexes
```
Templates are stored with:
Generate AI metadata for templates:
```bash
# Requires OPENAI_API_KEY in .env
export OPENAI_API_KEY="sk-..."

# Generate for templates without metadata (recommended after incremental update)
npm run fetch:templates -- --metadata-only

# Generate during template fetch (slower, but automatic)
npm run fetch:templates:update -- --generate-metadata
```
**Important**: Incremental updates (`--update`) do NOT generate metadata by default. After running `npm run fetch:templates:update`, you'll have new templates without metadata. Run `--metadata-only` separately to generate metadata for them.
```bash
# See how many templates have metadata
sqlite3 data/nodes.db "SELECT
  COUNT(*) as total,
  SUM(CASE WHEN metadata_json IS NOT NULL THEN 1 ELSE 0 END) as with_metadata,
  SUM(CASE WHEN metadata_json IS NULL THEN 1 ELSE 0 END) as without_metadata
FROM templates"

# See recent templates without metadata
sqlite3 data/nodes.db "SELECT id, name, created_at
FROM templates
WHERE metadata_json IS NULL
ORDER BY created_at DESC
LIMIT 10"
```
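The `total` and `with_metadata` counts from the first query can be turned into a coverage percentage. A small sketch using plain shell integer arithmetic:

```shell
# metadata_coverage: integer percentage of templates that have metadata,
# given the total and with_metadata counts from the query above.
metadata_coverage() {
  total="$1"
  with="$2"
  # guard against an empty templates table
  if [ "$total" -eq 0 ]; then
    echo "0%"
  else
    echo "$(( with * 100 / total ))%"
  fi
}
```

For example, `metadata_coverage 2598 2400` pairs directly with the counts the SQL reports.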
Metadata includes:
If metadata generation fails:
- Check `temp/batch/batch_*_error.jsonl` for error details
- `"Unsupported value: 'temperature'"` - the model doesn't support a custom temperature
- `"Invalid request"` - check that `OPENAI_API_KEY` is valid
- Generation uses `gpt-5-mini-2025-08-07` by default

The system will automatically process batch errors and generate default metadata.
Example error handling:
```bash
# If you see: "No output file available for batch job"
# Check: temp/batch/batch_*_error.jsonl for error details
# The system now automatically processes errors and generates default metadata
```
Optional configuration:
```bash
# OpenAI for metadata generation
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-4o-mini    # Default model
OPENAI_BATCH_SIZE=50        # Batch size for metadata generation

# Metadata generation limits
METADATA_LIMIT=100          # Max templates to process (0 = all)
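Before kicking off a metadata run, it can help to fail fast on a missing key. A sketch; the `sk-` prefix check is a key-format convention assumed here, not something the project enforces:

```shell
# check_openai_env: verify OPENAI_API_KEY is set and plausibly shaped
# before spending time on a batch run. The sk- prefix is an assumption.
check_openai_env() {
  if [ -z "$OPENAI_API_KEY" ]; then
    echo "OPENAI_API_KEY is not set" >&2
    return 1
  fi
  case "$OPENAI_API_KEY" in
    sk-*) echo "OPENAI_API_KEY looks valid" ;;
    *)    echo "OPENAI_API_KEY does not start with sk-" >&2; return 1 ;;
  esac
}
```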
After update, check stats:
```bash
# Template count
sqlite3 data/nodes.db "SELECT COUNT(*) FROM templates"

# Most recent template
sqlite3 data/nodes.db "SELECT MAX(created_at) FROM templates"

# Templates by view count
sqlite3 data/nodes.db "SELECT COUNT(*),
  CASE
    WHEN views < 50 THEN '<50'
    WHEN views < 100 THEN '50-100'
    WHEN views < 500 THEN '100-500'
    ELSE '500+'
  END as view_range
FROM templates GROUP BY view_range"
```
Templates are available through MCP tools:
- `list_templates`: List all templates
- `get_template`: Get a specific template with workflow
- `search_templates`: Search by keyword
- `list_node_templates`: Templates using specific nodes
- `get_templates_for_task`: Templates for common tasks
- `search_templates_by_metadata`: Advanced filtering

See `npm run test:templates` for usage examples.
Typical incremental update:
Full rebuild:
After updating templates, run `npm run test:templates`.

Related documentation:
- MEMORY_N8N_UPDATE.md - Updating n8n dependencies
- CLAUDE.md - Project overview and architecture
- README.md - User documentation