apps/docs/getting-started/api-keys.mdx
Task Master supports multiple AI providers through environment variables. This page lists all available API keys and their configuration requirements.
Note: At least one provider API key must be configured for Task Master to function. "Required: Yes" below means "required to use that specific provider," not "required globally" — you only need at least one provider configured.
| Provider | Environment Variable (example) | Key Format / Notes |
| --- | --- | --- |
| Anthropic | `ANTHROPIC_API_KEY="sk-ant-api03-your-key-here"` | `sk-ant-api03-...` |
| Perplexity | `PERPLEXITY_API_KEY="pplx-your-key-here"` | `pplx-...` |
| OpenAI | `OPENAI_API_KEY="sk-proj-your-key-here"` | `sk-proj-...` or `sk-...` |
| Google | `GOOGLE_API_KEY="your-google-api-key-here"` | Or use `GOOGLE_APPLICATION_CREDENTIALS` for a service account (Google Vertex) |
| Groq | `GROQ_API_KEY="your-groq-key-here"` | |
| OpenRouter | `OPENROUTER_API_KEY="your-openrouter-key-here"` | |
| Azure OpenAI | `AZURE_OPENAI_API_KEY="your-azure-key-here"` | Also requires `AZURE_OPENAI_ENDPOINT` configuration |
| xAI | `XAI_API_KEY="your-xai-key-here"` | |
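As a quick sanity check that at least one of the keys above is set in your shell, you can loop over them. This is a generic POSIX sketch, not a Task Master command; the `export` at the top is a demo placeholder so the check has something to find.

```shell
# Demo placeholder so the check finds something; replace with your real key
export ANTHROPIC_API_KEY="sk-ant-api03-your-key-here"

# Scan the provider keys listed above
found=0
for key in ANTHROPIC_API_KEY PERPLEXITY_API_KEY OPENAI_API_KEY GOOGLE_API_KEY \
           GROQ_API_KEY OPENROUTER_API_KEY AZURE_OPENAI_API_KEY XAI_API_KEY; do
  if [ -n "$(printenv "$key")" ]; then
    echo "configured: $key"
    found=1
  fi
done
[ "$found" -eq 1 ] && echo "at least one provider key is set"
```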
Note: These API keys are optional; these providers either work without them or use alternative authentication methods.
```bash
# Optional - the AWS credential chain is preferred
AWS_ACCESS_KEY_ID="your-aws-access-key"
AWS_SECRET_ACCESS_KEY="your-aws-secret-key"

# Not typically needed
CLAUDE_CODE_API_KEY="not-usually-required"

# Optional - OAuth via the CLI is preferred
GEMINI_API_KEY="your-gemini-key-here"

# Optional - CLI config is preferred
GROK_CLI_API_KEY="your-grok-cli-key"

# Only needed for remote Ollama servers
OLLAMA_API_KEY="your-ollama-api-key-here"
```
GitHub (key format `ghp_...` or `github_pat_...`):

```bash
GITHUB_API_KEY="ghp_your-github-key-here"
```
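A token whose prefix doesn't match those formats usually signals a copy-paste mistake. A small shell sketch to check the prefix (the value here is a placeholder, not a real token):

```shell
# Placeholder token; substitute your own
GITHUB_API_KEY="ghp_your-github-key-here"

# Accept the two documented GitHub token prefixes
case "$GITHUB_API_KEY" in
  ghp_*|github_pat_*) format=ok ;;
  *) format=unexpected ;;
esac
echo "token format: $format"
```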
Create a `.env` file in your project root:

```bash
# Copy from .env.example
cp .env.example .env

# Edit with your keys
vim .env
```

Alternatively, export the keys directly in your shell:

```bash
export ANTHROPIC_API_KEY="your-key-here"
export PERPLEXITY_API_KEY="your-key-here"
# ... other keys
```
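If you keep keys in `.env`, the standard `set -a` idiom exports everything the file defines into the current shell. This is a generic POSIX technique, not a Task Master feature, and the file contents below are placeholders:

```shell
# Write a minimal .env with placeholder values
cat > .env <<'EOF'
ANTHROPIC_API_KEY="sk-ant-api03-your-key-here"
PERPLEXITY_API_KEY="pplx-your-key-here"
EOF

# While -a (allexport) is on, every variable assigned
# while sourcing the file is exported automatically
set -a
. ./.env
set +a

echo "loaded: $ANTHROPIC_API_KEY"
```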
For Claude Code integration, configure keys in `.mcp.json`:

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "your-key-here",
        "PERPLEXITY_API_KEY": "your-key-here",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
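A stray comma or quote in `.mcp.json` can stop the MCP server from loading, so it may be worth validating the file's syntax after editing. One dependency-free way is Python's built-in `json.tool`; this is an optional check, not part of Task Master:

```shell
# Write the config from above (placeholder key)
cat > .mcp.json <<'EOF'
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": { "ANTHROPIC_API_KEY": "your-key-here" }
    }
  }
}
EOF

# Exits non-zero (and prints an error) if the JSON is malformed
python3 -m json.tool .mcp.json > /dev/null && echo "valid JSON"
```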
Some providers need additional configuration beyond an API key:

- Azure OpenAI: `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` configuration
- Google Vertex: `VERTEX_PROJECT_ID` and `VERTEX_LOCATION` environment variables

After setting up API keys, configure which models to use:
```bash
# Interactive model setup
task-master models --setup

# Set specific models
task-master models --set-main claude-3-5-sonnet-20241022
task-master models --set-research perplexity-llama-3.1-sonar-large-128k-online
task-master models --set-fallback gpt-4o-mini
```
Keep `.env` in your `.gitignore` so API keys are never committed.

To verify your configuration:

```bash
# Check if keys are properly configured
task-master models

# Test a specific provider
task-master add-task --prompt="test task" --model=claude-3-5-sonnet-20241022
```
If you encounter issues with API key configuration: