docs/zai/provider.md
Support z.ai (GLM) as an optional upstream for Anthropic-compatible requests (/v1/messages), without applying any Google/Gemini-specific transformations when z.ai is selected.
This keeps compatibility high (request/response shapes stay Anthropic-like) and avoids coupling z.ai traffic to the Google account pool.
We added an optional “z.ai provider” that:
- Is configured via settings under `proxy.zai.*`.
- Routes `/v1/messages` and `/v1/messages/count_tokens` to a z.ai Anthropic-compatible base URL.

Schema: `src-tauri/src/proxy/config.rs`
- `ZaiConfig` in `src-tauri/src/proxy/config.rs`
- `ZaiDispatchMode` in `src-tauri/src/proxy/config.rs`

Key fields:
- `proxy.zai.enabled`
- `proxy.zai.base_url` (default `https://api.z.ai/api/anthropic`)
- `proxy.zai.api_key`
- `proxy.zai.dispatch_mode`:
  - `off`
  - `exclusive`
  - `pooled`
  - `fallback`
- `proxy.zai.models`: default mapping for `claude-*` request models:
  - `opus`, `sonnet`, `haiku`

Entry point: `src-tauri/src/proxy/handlers/claude.rs`
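A sketch of what the configuration types above might look like in Rust. Field and variant names are inferred from the `proxy.zai.*` keys, and the target model names in the default mapping are placeholders; the real definitions in `config.rs` may differ.

```rust
use std::collections::HashMap;

/// Dispatch strategy for z.ai traffic (assumed variants, mirroring the
/// `proxy.zai.dispatch_mode` values listed above).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ZaiDispatchMode {
    Off,
    Exclusive,
    Pooled,
    Fallback,
}

/// Assumed shape of the `proxy.zai.*` settings block.
#[derive(Debug, Clone)]
pub struct ZaiConfig {
    pub enabled: bool,
    pub base_url: String,
    pub api_key: Option<String>,
    pub dispatch_mode: ZaiDispatchMode,
    /// Maps `claude-*` model families (`opus`, `sonnet`, `haiku`)
    /// to z.ai model names.
    pub models: HashMap<String, String>,
}

impl Default for ZaiConfig {
    fn default() -> Self {
        // The mapping keys come from the doc; the values here are
        // purely illustrative placeholders.
        let models: HashMap<String, String> = ["opus", "sonnet", "haiku"]
            .iter()
            .map(|family| (family.to_string(), format!("zai-model-for-{family}")))
            .collect();
        Self {
            enabled: false,
            base_url: "https://api.z.ai/api/anthropic".to_string(),
            api_key: None,
            dispatch_mode: ZaiDispatchMode::Off,
            models,
        }
    }
}
```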
- `handle_messages(...)` decides whether to route the request to z.ai or to the existing Google-backed flow.
- `pooled` mode uses round-robin across `(google_accounts + 1)` slots, where slot 0 is z.ai.

Provider implementation: `src-tauri/src/proxy/providers/zai_anthropic.rs`
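The pooled rotation described above can be sketched as follows. The `Slot` type, the static counter, and the function name are illustrative, not the actual handler code; only the `(google_accounts + 1)` slot math with slot 0 reserved for z.ai comes from the doc.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

/// Which upstream a pooled request should hit.
#[derive(Debug, PartialEq, Eq)]
pub enum Slot {
    /// Slot 0: the z.ai provider.
    Zai,
    /// Slots 1..=N: index into the Google account pool.
    Google(usize),
}

/// Monotonic counter shared across requests (illustrative).
static ROUND_ROBIN: AtomicUsize = AtomicUsize::new(0);

/// Round-robin over `google_accounts + 1` slots, slot 0 reserved for z.ai.
pub fn next_slot(google_accounts: usize) -> Slot {
    let n = ROUND_ROBIN.fetch_add(1, Ordering::Relaxed);
    match n % (google_accounts + 1) {
        0 => Slot::Zai,
        i => Slot::Google(i - 1),
    }
}
```

With two Google accounts, requests cycle z.ai, account 0, account 1, z.ai, and so on.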
- Sets the z.ai credentials on the outgoing request (`Authorization` / `x-api-key`) and forwards the request body as-is.

To use z.ai exclusively, configure it in the proxy settings UI (`src/pages/ApiProxy.tsx`) and set `dispatch_mode=exclusive`.
Anthropic-compatible clients can then send requests to `POST /v1/messages`.
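Putting the modes together, the top-level decision in `handle_messages(...)` might reduce to something like the sketch below. The `Route` type is an assumption, and in particular the direction of `fallback` (z.ai as backup when the Google flow fails) is inferred, not stated in this doc.

```rust
#[derive(Debug, PartialEq, Eq)]
pub enum ZaiDispatchMode { Off, Exclusive, Pooled, Fallback }

/// Illustrative routing outcome for an incoming `/v1/messages` request.
#[derive(Debug, PartialEq, Eq)]
pub enum Route {
    /// Send to the existing Google-backed flow.
    Google,
    /// Send to the z.ai Anthropic-compatible endpoint.
    Zai,
    /// Rotate across z.ai and the Google account pool per request.
    RoundRobin,
    /// Assumed semantics: try Google first, retry against z.ai on failure.
    GoogleThenZai,
}

/// Map the configured mode to a routing decision.
pub fn route_for(enabled: bool, mode: ZaiDispatchMode) -> Route {
    if !enabled {
        return Route::Google;
    }
    match mode {
        ZaiDispatchMode::Off => Route::Google,
        ZaiDispatchMode::Exclusive => Route::Zai,
        ZaiDispatchMode::Pooled => Route::RoundRobin,
        ZaiDispatchMode::Fallback => Route::GoogleThenZai,
    }
}
```

Note that a disabled provider always falls back to the Google flow, regardless of the configured mode.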