docs/design/provider-refactoring.md

- Issue: #283
- Discussion: #122
- Branch: feat/refactor-provider-by-protocol
Current State: each Provider requires a predefined field in `ProvidersConfig`:

```go
type ProvidersConfig struct {
	Anthropic  ProviderConfig `json:"anthropic"`
	OpenAI     ProviderConfig `json:"openai"`
	DeepSeek   ProviderConfig `json:"deepseek"`
	Qwen       ProviderConfig `json:"qwen"`
	Cerebras   ProviderConfig `json:"cerebras"`
	VolcEngine ProviderConfig `json:"volcengine"`
	// ... every new provider requires changes here
}
```
Problems:

- The `CreateProvider` function in `http_provider.go` has grown to 200+ lines of switch-case.

Recent PRs demonstrate this issue:
| PR | Provider | Code Changes |
|---|---|---|
| #365 | Qwen | +17 lines to http_provider.go |
| #333 | Cerebras | +17 lines to http_provider.go |
| #368 | Volcengine | +18 lines to http_provider.go |
Each OpenAI-compatible Provider requires:

- `config.go`: add a configuration field
- `http_provider.go`: add a switch case

Agents must then reference the provider by name:

```json
{
  "agents": {
    "defaults": {
      "provider": "deepseek", // need to know provider name
      "model": "deepseek-chat"
    }
  }
}
```
Problem: Agent needs to know both provider and model, adding complexity.
Inspired by LiteLLM's design: use a `protocol/model_name` format, e.g., `openai/gpt-5.4`, `anthropic/claude-sonnet-4.6`:

```json
{
  "model_list": [
    {
      "model_name": "deepseek-chat",
      "model": "openai/deepseek-chat",
      "api_base": "https://api.deepseek.com/v1",
      "api_key": "sk-xxx"
    },
    {
      "model_name": "gpt-5.4",
      "model": "openai/gpt-5.4",
      "api_key": "sk-xxx"
    },
    {
      "model_name": "claude-sonnet-4.6",
      "model": "anthropic/claude-sonnet-4.6",
      "api_key": "sk-xxx"
    },
    {
      "model_name": "gemini-3-flash",
      "model": "antigravity/gemini-3-flash",
      "auth_method": "oauth"
    },
    {
      "model_name": "my-company-llm",
      "model": "openai/company-model-v1",
      "api_base": "https://llm.company.com/v1",
      "api_key": "xxx"
    }
  ],
  "agents": {
    "defaults": {
      "model": "deepseek-chat",
      "max_tokens": 8192,
      "temperature": 0.7
    }
  }
}
```
```go
type Config struct {
	ModelList []ModelConfig   `json:"model_list"` // new
	Providers ProvidersConfig `json:"providers"`  // old, deprecated
	Agents    AgentsConfig    `json:"agents"`
	Channels  ChannelsConfig  `json:"channels"`
	// ...
}
```
```go
type ModelConfig struct {
	// Required
	ModelName string `json:"model_name"` // user-facing name (alias)
	Model     string `json:"model"`      // protocol/model, e.g., openai/gpt-5.4

	// Common config
	APIBase string `json:"api_base,omitempty"`
	APIKey  string `json:"api_key,omitempty"`
	Proxy   string `json:"proxy,omitempty"`

	// Special provider config
	AuthMethod  string `json:"auth_method,omitempty"`  // oauth, token
	ConnectMode string `json:"connect_mode,omitempty"` // stdio, grpc

	// Optional optimizations
	RPM            int    `json:"rpm,omitempty"`              // rate limit
	MaxTokensField string `json:"max_tokens_field,omitempty"` // max_tokens or max_completion_tokens
}
```
Identify protocol via prefix in model field:
| Prefix | Protocol | Description |
|---|---|---|
| openai/ | OpenAI-compatible | Most common; includes DeepSeek, Qwen, Groq, etc. |
| anthropic/ | Anthropic | Claude series specific |
| antigravity/ | Antigravity | Google Cloud Code Assist |
| gemini/ | Gemini | Google Gemini native API |
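The prefix lookup can be sketched in a few lines of Go. `splitModelRef` is a hypothetical helper name; the only real dependency is `strings.Cut`:

```go
package main

import (
	"fmt"
	"strings"
)

// splitModelRef splits a "protocol/model" reference into its two parts.
// The protocol selects the wire implementation; everything after the first
// slash is passed through as the upstream model name.
func splitModelRef(ref string) (protocol, model string, err error) {
	protocol, model, ok := strings.Cut(ref, "/")
	if !ok || protocol == "" || model == "" {
		return "", "", fmt.Errorf("invalid model reference %q: want protocol/model", ref)
	}
	return protocol, model, nil
}

func main() {
	p, m, _ := splitModelRef("openai/deepseek-chat")
	fmt.Println(p, m) // openai deepseek-chat
}
```

Note that only the first slash is significant, so upstream model names containing slashes still round-trip correctly.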
| Problem | Old Approach | New Approach |
|---|---|---|
| Add OpenAI-compatible Provider | Change 3 code locations | Add one config entry |
| Agent specifies model | Need provider + model | Only need model |
| Code duplication | Each Provider duplicates logic | Share protocol implementation |
| Multi-Agent support | Complex | Naturally compatible |
```json
{
  "model_list": [...],
  "agents": {
    "defaults": {
      "model": "deepseek-chat"
    },
    "coder": {
      "model": "gpt-5.4",
      "system_prompt": "You are a coding assistant..."
    },
    "translator": {
      "model": "claude-sonnet-4.6"
    }
  }
}
```
Each Agent only needs to specify model (corresponds to model_name in model_list).
LiteLLM (the most mature open-source LLM proxy) uses a similar design:

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-5.4
      api_key: xxx
  - model_name: my-custom
    litellm_params:
      model: openai/custom-model
      api_base: https://my-api.com/v1
```
Support both `providers` and `model_list`:

```go
func (c *Config) GetModelConfig(modelName string) (*ModelConfig, error) {
	// Prefer the new config
	if len(c.ModelList) > 0 {
		return c.findModelByName(modelName)
	}
	// Backward compatibility with the old config
	if !c.Providers.IsEmpty() {
		logger.Warn("'providers' config is deprecated, please migrate to 'model_list'")
		return c.convertFromProviders(modelName)
	}
	return nil, fmt.Errorf("model %s not found", modelName)
}
```
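The `findModelByName` helper used above, together with the `model_name` uniqueness validation this design calls for, could look like the following sketch. The struct definitions are trimmed to the fields used here, and `validateModelList` is an assumed name for the load-time check:

```go
package main

import "fmt"

// Trimmed versions of the config structs defined earlier in this document.
type ModelConfig struct {
	ModelName string `json:"model_name"`
	Model     string `json:"model"`
}

type Config struct {
	ModelList []ModelConfig `json:"model_list"`
}

// findModelByName looks up a ModelConfig by its user-facing alias.
// A linear scan is fine for typical list sizes; a map built at load
// time would also work.
func (c *Config) findModelByName(modelName string) (*ModelConfig, error) {
	for i := range c.ModelList {
		if c.ModelList[i].ModelName == modelName {
			return &c.ModelList[i], nil
		}
	}
	return nil, fmt.Errorf("model %s not found in model_list", modelName)
}

// validateModelList rejects duplicate model_name aliases at config-load time,
// so lookups are unambiguous.
func (c *Config) validateModelList() error {
	seen := make(map[string]bool, len(c.ModelList))
	for _, m := range c.ModelList {
		if seen[m.ModelName] {
			return fmt.Errorf("duplicate model_name %q in model_list", m.ModelName)
		}
		seen[m.ModelName] = true
	}
	return nil
}

func main() {
	c := Config{ModelList: []ModelConfig{
		{ModelName: "deepseek-chat", Model: "openai/deepseek-chat"},
	}}
	m, err := c.findModelByName("deepseek-chat")
	fmt.Println(m.Model, err)
}
```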
Deprecation path:

- Mark `providers` as deprecated in the documentation
- After the compatibility period, remove `providers` support and the `agents.defaults.provider` field
- Keep `model_list` as the only configuration source

Old Config:

```json
{
  "providers": {
    "deepseek": {
      "api_key": "sk-xxx",
      "api_base": "https://api.deepseek.com/v1"
    }
  },
  "agents": {
    "defaults": {
      "provider": "deepseek",
      "model": "deepseek-chat"
    }
  }
}
```
New Config:
```json
{
  "model_list": [
    {
      "model_name": "deepseek-chat",
      "model": "openai/deepseek-chat",
      "api_base": "https://api.deepseek.com/v1",
      "api_key": "sk-xxx"
    }
  ],
  "agents": {
    "defaults": {
      "model": "deepseek-chat"
    }
  }
}
```
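The automatic migration script mentioned in the risks below could build on a per-entry converter like this sketch. `migrateProvider` is a hypothetical helper, and the protocol mapping (anthropic keeps its native protocol, the other legacy providers are OpenAI-compatible) is an assumption:

```go
package main

import "fmt"

// Trimmed config types from earlier in this document.
type ProviderConfig struct {
	APIKey  string `json:"api_key"`
	APIBase string `json:"api_base"`
}

type ModelConfig struct {
	ModelName string `json:"model_name"`
	Model     string `json:"model"`
	APIBase   string `json:"api_base,omitempty"`
	APIKey    string `json:"api_key,omitempty"`
}

// migrateProvider converts one legacy providers entry into a model_list entry.
// name is the legacy provider key (e.g. "deepseek"), model the model the
// agent config referenced alongside it.
func migrateProvider(name, model string, pc ProviderConfig) ModelConfig {
	protocol := "openai" // assumption: legacy providers speak the OpenAI protocol
	if name == "anthropic" {
		protocol = "anthropic"
	}
	return ModelConfig{
		ModelName: model,
		Model:     protocol + "/" + model,
		APIBase:   pc.APIBase,
		APIKey:    pc.APIKey,
	}
}

func main() {
	mc := migrateProvider("deepseek", "deepseek-chat", ProviderConfig{
		APIKey:  "sk-xxx",
		APIBase: "https://api.deepseek.com/v1",
	})
	fmt.Println(mc.Model, mc.APIBase) // openai/deepseek-chat https://api.deepseek.com/v1
}
```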
Implementation tasks:

- Add the `ModelConfig` struct
- Add the `Config.ModelList` field
- Implement the `GetModelConfig(modelName)` method
- Add `model_name` uniqueness validation
- Create the `pkg/providers/factory/` directory
- Implement `CreateProviderFromModelConfig()`
- Move `http_provider.go` to `openai/provider.go`
- Update `CreateProvider()`

Risks:

| Risk | Mitigation |
|---|---|
| Breaking existing configs | Compatibility period keeps old config working |
| User migration cost | Provide automatic migration script |
| Special Provider incompatibility | Keep auth_method and other extension fields |
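The `CreateProviderFromModelConfig()` task above amounts to a single dispatch over the protocol prefix, replacing the 200+ line per-provider switch. The `Provider` interface, the stub constructors, and the registry shape here are placeholder assumptions; only the dispatch pattern is the point:

```go
package main

import (
	"fmt"
	"strings"
)

// Provider is a stand-in for the real provider interface (assumption).
type Provider interface{ Protocol() string }

type stubProvider struct{ protocol, model string }

func (p stubProvider) Protocol() string { return p.protocol }

// Trimmed ModelConfig from earlier in this document.
type ModelConfig struct {
	ModelName string
	Model     string
	APIBase   string
	APIKey    string
}

// factories maps a protocol prefix to a constructor. A new OpenAI-compatible
// endpoint needs no new entry here: it reuses the "openai" constructor and
// differs only in api_base/api_key from its config entry.
var factories = map[string]func(mc ModelConfig, model string) Provider{
	"openai":      func(mc ModelConfig, model string) Provider { return stubProvider{"openai", model} },
	"anthropic":   func(mc ModelConfig, model string) Provider { return stubProvider{"anthropic", model} },
	"antigravity": func(mc ModelConfig, model string) Provider { return stubProvider{"antigravity", model} },
	"gemini":      func(mc ModelConfig, model string) Provider { return stubProvider{"gemini", model} },
}

// CreateProviderFromModelConfig resolves the protocol prefix and delegates
// to the matching constructor.
func CreateProviderFromModelConfig(mc ModelConfig) (Provider, error) {
	protocol, model, ok := strings.Cut(mc.Model, "/")
	if !ok {
		return nil, fmt.Errorf("invalid model reference %q: want protocol/model", mc.Model)
	}
	create, found := factories[protocol]
	if !found {
		return nil, fmt.Errorf("unknown protocol %q", protocol)
	}
	return create(mc, model), nil
}

func main() {
	p, err := CreateProviderFromModelConfig(ModelConfig{Model: "openai/deepseek-chat"})
	fmt.Println(p.Protocol(), err) // openai <nil>
}
```

A map-based registry rather than a switch also lets protocol packages register themselves from an `init()`, keeping the factory package free of per-protocol imports.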