This guide shows how to call Claude models (and any LiteLLM-supported model) through the LiteLLM proxy from Claude Code.
Note: This integration is based on Anthropic's official LiteLLM configuration documentation. It lets you use any LiteLLM-supported model through Claude Code with centralized authentication, usage tracking, and cost controls.
Watch the full tutorial: https://www.loom.com/embed/3c17d683cdb74d36a3698763cc558f56
First, install LiteLLM with proxy support:

```shell
pip install 'litellm[proxy]'
```
Create a secure configuration using environment variables:

```yaml
model_list:
  # Claude models
  - model_name: claude-3-5-sonnet-20241022
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: anthropic/claude-3-5-haiku-20241022
      api_key: os.environ/ANTHROPIC_API_KEY

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```
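The `os.environ/` prefix tells LiteLLM to read a value from an environment variable rather than storing the secret in the file. A minimal sketch of that resolution convention (`resolve_env_ref` is a hypothetical helper, not LiteLLM's internal code):

```python
import os

def resolve_env_ref(value: str) -> str:
    """Resolve a LiteLLM-style 'os.environ/VAR' reference to the variable's value."""
    prefix = "os.environ/"
    if isinstance(value, str) and value.startswith(prefix):
        var_name = value[len(prefix):]
        resolved = os.environ.get(var_name)
        if resolved is None:
            raise KeyError(f"environment variable {var_name!r} is not set")
        return resolved
    return value  # plain literals pass through unchanged

os.environ["ANTHROPIC_API_KEY"] = "sk-ant-example"  # demo value only
print(resolve_env_ref("os.environ/ANTHROPIC_API_KEY"))  # -> sk-ant-example
print(resolve_env_ref("sk-literal-key"))                # -> sk-literal-key
```

This is why the shell exports in the next step must be set before the proxy starts: the references are resolved at startup.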
Set your environment variables:

```shell
export ANTHROPIC_API_KEY="your-anthropic-api-key"
export LITELLM_MASTER_KEY="sk-1234567890"  # Generate a secure key
```

Then start the proxy:

```shell
litellm --config /path/to/config.yaml
# RUNNING on http://0.0.0.0:4000
```
Test that your proxy is working correctly:

```shell
curl -X POST http://0.0.0.0:4000/v1/messages \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1000,
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'
```
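The same test can be issued from Python. A minimal sketch using only the standard library; the payload mirrors the curl call above, and the URL and key are the placeholder values from this guide:

```python
import json
import urllib.request

def build_messages_request(base_url: str, api_key: str, model: str, prompt: str,
                           max_tokens: int = 1000) -> urllib.request.Request:
    """Build a POST to the proxy's /v1/messages endpoint (mirrors the curl test)."""
    payload = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/messages",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_messages_request("http://0.0.0.0:4000", "sk-1234567890",
                             "claude-3-5-sonnet-20241022",
                             "What is the capital of France?")
# urllib.request.urlopen(req) would send it against a running proxy;
# here we only inspect the request that would go out.
print(req.full_url)                   # http://0.0.0.0:4000/v1/messages
print(json.loads(req.data)["model"])  # claude-3-5-sonnet-20241022
```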
Configure Claude Code to use LiteLLM's unified endpoint. Either a virtual key or the master key can be used here:

```shell
export ANTHROPIC_BASE_URL="http://0.0.0.0:4000"
export ANTHROPIC_AUTH_TOKEN="$LITELLM_MASTER_KEY"
```
Tip: LITELLM_MASTER_KEY gives Claude access to all proxy models, whereas a virtual key would be limited to the models set in the UI.
Alternatively, use the Anthropic pass-through endpoint:

```shell
export ANTHROPIC_BASE_URL="http://0.0.0.0:4000/anthropic"
export ANTHROPIC_AUTH_TOKEN="$LITELLM_MASTER_KEY"
```
You have two options for specifying which model Claude Code uses:
Specify the model directly when starting Claude Code or during a session:

```shell
# Specify model at startup
claude --model claude-3-5-sonnet-20241022

# Or change model during a session (inside Claude Code)
/model claude-3-5-haiku-20241022
```
This method uses the exact model you specify.
Configure default models using environment variables:

```shell
# Tell Claude Code which models to use by default
export ANTHROPIC_DEFAULT_SONNET_MODEL=claude-3-5-sonnet-20241022
export ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-3-5-haiku-20241022
export ANTHROPIC_DEFAULT_OPUS_MODEL=claude-3-opus-20240229

claude  # Will use the models specified above
```
Note: Claude Code may cache the model from a previous session. If environment variables don't take effect, use Option 1 to explicitly set the model.
Important: The model_name in your LiteLLM config must match what Claude Code requests (either from env vars or command line).
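That matching requirement can be checked mechanically. A small sketch (`find_missing_models` is a hypothetical helper) that verifies every model Claude Code might request has a `model_name` entry in the proxy config:

```python
def find_missing_models(requested: list[str], model_list: list[dict]) -> list[str]:
    """Return requested model names that have no matching model_name entry."""
    configured = {entry["model_name"] for entry in model_list}
    return [name for name in requested if name not in configured]

# Entries as they would appear in config.yaml's model_list
config_model_list = [
    {"model_name": "claude-3-5-sonnet-20241022"},
    {"model_name": "claude-3-5-haiku-20241022"},
]

missing = find_missing_models(
    ["claude-3-5-sonnet-20241022", "claude-3-opus-20240229"],
    config_model_list,
)
print(missing)  # ['claude-3-opus-20240229'] -> would trigger "model not found"
```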
Claude Code supports extended context (1 million tokens) using the `[1m]` suffix with Claude 4+ models:

```shell
# Use Sonnet 4.5 with 1M context (requires quotes for shell)
claude --model 'claude-sonnet-4-5-20250929[1m]'

# Inside a Claude Code session (no quotes needed)
/model claude-sonnet-4-5-20250929[1m]
```

Important: When using `--model` with `[1m]` in the shell, you must use quotes to prevent the shell from interpreting the brackets.

Alternatively, set the default with environment variables:

```shell
export ANTHROPIC_DEFAULT_SONNET_MODEL='claude-sonnet-4-5-20250929[1m]'
claude
```
How it works:
- Claude Code passes the model name, including the `[1m]` suffix, before sending to LiteLLM
- LiteLLM strips the suffix and adds the `anthropic-beta: context-1m-2025-08-07` header when forwarding the request to the provider
- LiteLLM supports `[1m]` in model names, so the suffixed name routes like the base model

Verify 1M context is active:

```shell
/context
# Should show: 21k/1000k tokens (2%)
```
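The suffix handling described above can be sketched as a small transformation (illustrative only, not LiteLLM's actual implementation):

```python
def apply_1m_suffix(model: str) -> tuple[str, dict]:
    """Strip a trailing '[1m]' suffix and return the base model plus the beta header."""
    suffix = "[1m]"
    if model.endswith(suffix):
        base = model[: -len(suffix)]
        return base, {"anthropic-beta": "context-1m-2025-08-07"}
    return model, {}  # no suffix: no extra header

model, headers = apply_1m_suffix("claude-sonnet-4-5-20250929[1m]")
print(model)    # claude-sonnet-4-5-20250929
print(headers)  # {'anthropic-beta': 'context-1m-2025-08-07'}
```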
Pricing: Models using 1M context have different pricing. Input tokens above 200k are charged at a higher rate.
Common issues and solutions:

Claude Code not connecting:
- Verify the proxy is running: `curl http://0.0.0.0:4000/health`
- Check that `ANTHROPIC_BASE_URL` is set correctly
- Check that `ANTHROPIC_AUTH_TOKEN` matches your LiteLLM master key

Authentication errors:
- Verify the key is set: `echo $LITELLM_MASTER_KEY`
- Confirm `ANTHROPIC_AUTH_TOKEN` matches your LiteLLM master key

Model not found:
- Confirm `config.yaml` has a matching `model_name` entry
- Check which model Claude Code is requesting: `echo $ANTHROPIC_DEFAULT_SONNET_MODEL`

1M context not working (showing 200k instead of 1000k):
- Make sure you included the `[1m]` suffix: `/model your-model-name[1m]`
- Check the proxy logs for `context-1m-2025-08-07` in the request
- Ensure your LiteLLM config supports `[1m]` in the `model_name`

You can configure LiteLLM to route to any supported provider. Here's an example with multiple providers:
```yaml
model_list:
  # OpenAI models
  - model_name: codex-mini
    litellm_params:
      model: openai/codex-mini
      api_key: os.environ/OPENAI_API_KEY
      api_base: https://api.openai.com/v1
  - model_name: o3-pro
    litellm_params:
      model: openai/o3-pro
      api_key: os.environ/OPENAI_API_KEY
      api_base: https://api.openai.com/v1
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
      api_base: https://api.openai.com/v1

  # Anthropic models
  - model_name: claude-3-5-sonnet-20241022
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: anthropic/claude-3-5-haiku-20241022
      api_key: os.environ/ANTHROPIC_API_KEY

  # AWS Bedrock
  - model_name: claude-bedrock
    litellm_params:
      model: bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      aws_region_name: us-east-1

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```
Note: The model_name can be anything you choose. Claude Code will request whatever model you specify (via env vars or command line), and LiteLLM will route to the model configured in litellm_params.
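That aliasing amounts to a name lookup: the requested `model_name` selects an entry, and its `litellm_params` decide the actual provider call. A sketch of the idea (`route_model` is a hypothetical helper, not LiteLLM's router):

```python
def route_model(requested: str, model_list: list[dict]) -> dict:
    """Look up the litellm_params for a requested model_name (first match wins)."""
    for entry in model_list:
        if entry["model_name"] == requested:
            return entry["litellm_params"]
    raise KeyError(f"no deployment configured for {requested!r}")

model_list = [
    {"model_name": "claude-bedrock",  # friendly alias Claude Code asks for
     "litellm_params": {
         "model": "bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0"}},
]

params = route_model("claude-bedrock", model_list)
print(params["model"])  # bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0
```

Claude Code only ever sees the alias; the Bedrock-specific details stay in the proxy config.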
Switch between models seamlessly:

```shell
# Use environment variables to set defaults
export ANTHROPIC_DEFAULT_SONNET_MODEL=claude-3-5-sonnet-20241022
export ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-3-5-haiku-20241022

# Or specify directly
claude --model claude-3-5-sonnet-20241022  # Complex reasoning
claude --model claude-3-5-haiku-20241022   # Fast responses
claude --model claude-bedrock              # Bedrock deployment
```
If you don't set environment variables, Claude Code uses these default model names:
| Purpose | Default Model Name (v2.1.14) |
|---|---|
| Main model | claude-sonnet-4-5-20250929 |
| Light tasks (subagents, summaries) | claude-haiku-4-5-20251001 |
| Planning mode | claude-opus-4-5-20251101 |
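The fallback behaviour in the table can be sketched as an environment lookup with built-in defaults. The default names come from the table; the resolution logic itself is an assumption about how Claude Code behaves, not its actual source:

```python
import os

# Defaults observed in Claude Code v2.1.14 (see table above)
DEFAULTS = {
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "claude-sonnet-4-5-20250929",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "claude-haiku-4-5-20251001",
    "ANTHROPIC_DEFAULT_OPUS_MODEL": "claude-opus-4-5-20251101",
}

def resolve_default_model(var_name: str) -> str:
    """Use the env var override when set, otherwise fall back to the built-in default."""
    return os.environ.get(var_name) or DEFAULTS[var_name]

os.environ["ANTHROPIC_DEFAULT_SONNET_MODEL"] = "claude-3-5-sonnet-20241022"  # override
os.environ.pop("ANTHROPIC_DEFAULT_HAIKU_MODEL", None)                        # no override

print(resolve_default_model("ANTHROPIC_DEFAULT_SONNET_MODEL"))  # claude-3-5-sonnet-20241022
print(resolve_default_model("ANTHROPIC_DEFAULT_HAIKU_MODEL"))   # claude-haiku-4-5-20251001
```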
Your LiteLLM config should include these model names if you want Claude Code to work without setting environment variables:

```yaml
model_list:
  - model_name: claude-sonnet-4-5-20250929
    litellm_params:
      # Can be any provider - Anthropic, Bedrock, Vertex AI, etc.
      model: anthropic/claude-sonnet-4-5-20250929
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-haiku-4-5-20251001
    litellm_params:
      model: anthropic/claude-haiku-4-5-20251001
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-opus-4-5-20251101
    litellm_params:
      model: anthropic/claude-opus-4-5-20251101
      api_key: os.environ/ANTHROPIC_API_KEY
```
Warning: These default model names may change with new Claude Code versions. Check LiteLLM proxy logs for "model not found" errors to identify what Claude Code is requesting.