# Claude Agent SDK with LiteLLM

A simple example showing how to use the Claude Agent SDK with LiteLLM as a proxy. This lets you use any LLM provider (OpenAI, Bedrock, Azure, etc.) through the Agent SDK.
## Installation

```bash
pip install anthropic claude-agent-sdk litellm
```
## Start the LiteLLM Proxy

```bash
# Simple start with Claude
litellm --model claude-sonnet-4-20250514

# Or with a config file
litellm --config config.yaml
```
## Run the Agent

Basic agent (no MCP):

```bash
python main.py
```

Agent with MCP (DeepWiki2 for research):

```bash
python agent_with_mcp.py
```

If the MCP connection fails, you can disable it (see the sketch below for how the flag works):

```bash
USE_MCP=false python agent_with_mcp.py
```
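How the flag works: the script can simply skip attaching the MCP server when `USE_MCP=false`. A minimal sketch of that pattern, assuming an HTTP MCP server entry (the exact wiring in `agent_with_mcp.py` may differ):

```python
import os

from claude_agent_sdk import ClaudeAgentOptions

# Gate MCP on the USE_MCP environment variable (defaults to enabled)
use_mcp = os.environ.get("USE_MCP", "true").lower() != "false"

# Illustrative HTTP MCP server entry; name and URL match the examples in this README
mcp_servers = (
    {"deepwiki2": {"type": "http", "url": "http://localhost:4000/mcp/deepwiki2"}}
    if use_mcp
    else {}
)

options = ClaudeAgentOptions(
    system_prompt="You are a helpful research assistant.",
    mcp_servers=mcp_servers,
)
```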
That's it! You can now chat with the agent in your terminal.
## Chat Commands

While chatting, you can use these commands:

- `models` - List all available models (fetched from your LiteLLM proxy)
- `model` - Switch to a different model
- `clear` - Start a new conversation
- `quit` or `exit` - End the chat

The chat automatically fetches available models from your LiteLLM proxy's `/models` endpoint, so you'll always see what's currently configured.
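Under the hood, listing models is one authenticated GET against the proxy's OpenAI-compatible `/models` route. A minimal sketch of that lookup using `requests` (`common.py` may implement it differently):

```python
import os

import requests

base_url = os.environ.get("LITELLM_PROXY_URL", "http://localhost:4000")
api_key = os.environ.get("LITELLM_API_KEY", "sk-1234")

resp = requests.get(
    f"{base_url}/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=10,
)
resp.raise_for_status()

# Each entry's "id" is a model_name from your config.yaml
model_ids = [m["id"] for m in resp.json()["data"]]
print(model_ids)
```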
## Configuration

Set these environment variables if needed:

```bash
export LITELLM_PROXY_URL="http://localhost:4000"
export LITELLM_API_KEY="sk-1234"
export LITELLM_MODEL="bedrock-claude-sonnet-4.5"
```
Or just use the defaults - the scripts connect to `http://localhost:4000` out of the box.
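A sketch of how the scripts can resolve these settings (the fallback model name here is an assumption; check `common.py` for the actual defaults):

```python
import os

# Read proxy settings, falling back to the defaults described above
LITELLM_PROXY_URL = os.environ.get("LITELLM_PROXY_URL", "http://localhost:4000")
LITELLM_API_KEY = os.environ.get("LITELLM_API_KEY", "sk-1234")
LITELLM_MODEL = os.environ.get("LITELLM_MODEL", "bedrock-claude-sonnet-4.5")
```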
## Files

- `main.py` - Basic interactive agent without MCP
- `agent_with_mcp.py` - Agent with MCP server integration (DeepWiki2)
- `common.py` - Shared utilities and functions
- `config.example.yaml` - Example LiteLLM configuration
- `requirements.txt` - Python dependencies

## Using Multiple Models

If you want to use multiple models, create a `config.yaml` (see `config.example.yaml`):
```yaml
model_list:
  - model_name: bedrock-claude-sonnet-4
    litellm_params:
      model: "bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0"
      aws_region_name: "us-east-1"
  - model_name: bedrock-claude-sonnet-4.5
    litellm_params:
      model: "bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0"
      aws_region_name: "us-east-1"
```
Then start LiteLLM with `litellm --config config.yaml`.
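Before pointing the agent at a new entry, you can sanity-check it directly against the proxy's OpenAI-compatible endpoint. A minimal sketch using the `openai` package (key, URL, and model name taken from the defaults above):

```python
from openai import OpenAI

# Smoke-test a configured model through the proxy's OpenAI-compatible API.
# The model name must match a model_name entry in config.yaml.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

resp = client.chat.completions.create(
    model="bedrock-claude-sonnet-4.5",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(resp.choices[0].message.content)
```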
## How It Works

The key is pointing the Agent SDK at LiteLLM instead of directly at Anthropic:
```python
import os

from claude_agent_sdk import ClaudeAgentOptions

# Point to the LiteLLM gateway (not Anthropic)
os.environ["ANTHROPIC_BASE_URL"] = "http://localhost:4000"
os.environ["ANTHROPIC_API_KEY"] = "sk-1234"  # Your LiteLLM key

# Use any model configured in LiteLLM
options = ClaudeAgentOptions(
    model="bedrock-claude-sonnet-4",  # or gpt-4, or anything else
    system_prompt="You are a helpful assistant.",
    max_turns=50,
)
```
**Note:** Don't add `/anthropic` to the base URL - LiteLLM handles the routing automatically.
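Putting it together, a one-shot end-to-end run looks roughly like the sketch below (the interactive loop in `main.py` may differ; message types follow the SDK's streaming interface):

```python
import asyncio
import os

from claude_agent_sdk import AssistantMessage, ClaudeAgentOptions, TextBlock, query

# Route the Agent SDK through LiteLLM instead of api.anthropic.com
os.environ["ANTHROPIC_BASE_URL"] = "http://localhost:4000"
os.environ["ANTHROPIC_API_KEY"] = "sk-1234"

options = ClaudeAgentOptions(
    model="bedrock-claude-sonnet-4",
    system_prompt="You are a helpful assistant.",
    max_turns=50,
)

async def main() -> None:
    # query() streams messages; print only the assistant's text blocks
    async for message in query(prompt="What is LiteLLM?", options=options):
        if isinstance(message, AssistantMessage):
            for block in message.content:
                if isinstance(block, TextBlock):
                    print(block.text)

asyncio.run(main())
```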
## Troubleshooting

**Connection errors?**

- Make sure the LiteLLM proxy is running: `litellm --model your-model`
- Check that the proxy URL is correct (default: `http://localhost:4000`)

**Authentication errors?**

- Check that `LITELLM_API_KEY` matches a key your proxy accepts

**Model not found?**

- Run `litellm --model your-model` to test it works

**Agent with MCP stuck or failing?**

- Check that the MCP endpoint is reachable: `http://localhost:4000/mcp/deepwiki2`
- Disable MCP: `USE_MCP=false python agent_with_mcp.py`
- Or fall back to the basic agent: `python main.py`
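To quickly separate connection problems from authentication problems, a small scripted check against the proxy can help - a sketch reusing the defaults from above:

```python
import os

import requests

base_url = os.environ.get("LITELLM_PROXY_URL", "http://localhost:4000")
api_key = os.environ.get("LITELLM_API_KEY", "sk-1234")

try:
    # A 200 means the proxy is up AND the key is accepted;
    # 401/403 points at the key, a ConnectionError at the proxy itself.
    resp = requests.get(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=5,
    )
    print(resp.status_code, resp.json() if resp.ok else resp.text)
except requests.ConnectionError:
    print(f"Proxy not reachable at {base_url} - is `litellm` running?")
```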