# Vertex AI Proxy

`packages/ai-gateway/VERTEX_PROXY_README.md`
## Overview

This proxy lets the Claude Agent SDK use Vertex AI without requiring users to set up GCP credentials. Users simply log in to Screenpipe, and the proxy authenticates to Vertex AI with Screenpipe's service account.
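Under the hood, the worker has to translate an Anthropic-style request into Vertex AI's publisher-model endpoint. A minimal sketch of that URL mapping, assuming the standard Vertex AI `rawPredict` URL shape (the helper name and region are illustrative, not taken from the worker's actual code):

```typescript
// Sketch: map project/region/model onto Vertex AI's Anthropic
// publisher endpoint. This is an assumption based on Vertex AI's
// public rawPredict URL format, not this worker's real implementation.
function vertexUrl(projectId: string, region: string, model: string): string {
  return (
    `https://${region}-aiplatform.googleapis.com/v1/projects/${projectId}` +
    `/locations/${region}/publishers/anthropic/models/${model}:rawPredict`
  );
}

// Usage: vertexUrl("my-project", "us-east5", "claude-3-5-haiku@20241022")
```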
## Environment Variables

The Agent SDK needs these variables set:

- `CLAUDE_CODE_USE_VERTEX=1`
- `ANTHROPIC_VERTEX_BASE_URL=https://ai-gateway.i-f9f.workers.dev` (or your worker URL)
- `CLAUDE_CODE_SKIP_VERTEX_AUTH=1`

## Deployment

```bash
cd packages/ai-gateway

# Set the service account JSON (paste the entire JSON, with quotes escaped)
wrangler secret put VERTEX_SERVICE_ACCOUNT_JSON

# Set the project ID
wrangler secret put VERTEX_PROJECT_ID

# Deploy the worker
wrangler deploy
```
## Local Development

```bash
# Copy the example env file and fill in your values
cp .dev.vars.example .dev.vars

# Start the local dev server
npm run dev

# Exercise the proxy locally
./test-vertex-local.sh
```

## Tauri Integration

In your Tauri app, when spawning the Agent SDK:
```typescript
// Set environment variables for the Agent SDK
const env = {
  CLAUDE_CODE_USE_VERTEX: '1',
  ANTHROPIC_VERTEX_BASE_URL: 'https://ai-gateway.i-f9f.workers.dev',
  CLAUDE_CODE_SKIP_VERTEX_AUTH: '1',
  // User's auth token for the proxy
  SCREENPIPE_AUTH_TOKEN: userAuthToken,
};

// The Agent SDK will pick up these env vars automatically
```
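When launching the SDK as a child process, these variables should be merged over the current environment rather than replacing it. A small sketch of that merge (the helper name is ours, not the app's):

```typescript
// Hypothetical helper: layer the proxy variables over the existing
// environment before spawning the Agent SDK process from the app.
function buildAgentEnv(userAuthToken: string): Record<string, string | undefined> {
  return {
    ...process.env,
    CLAUDE_CODE_USE_VERTEX: "1",
    ANTHROPIC_VERTEX_BASE_URL: "https://ai-gateway.i-f9f.workers.dev",
    CLAUDE_CODE_SKIP_VERTEX_AUTH: "1",
    SCREENPIPE_AUTH_TOKEN: userAuthToken,
  };
}
```

Pass the result as the `env` option to `child_process.spawn` (or Tauri's sidecar equivalent) so the SDK inherits both the proxy settings and the rest of the environment.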
## API

### `POST /v1/messages`

Proxies Anthropic Messages API requests to Vertex AI.
Headers:

- `Authorization: Bearer <user-token>`: the user's Screenpipe auth token
- `Content-Type: application/json`

Body: standard Anthropic Messages API format.
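On the worker side, the first step for each request is pulling the user token out of the `Authorization` header. A sketch of that extraction (illustrative only; the real validation logic lives in the worker):

```typescript
// Extract the bearer token from an Authorization header value.
// Returns null when the header is missing or not in Bearer form.
function extractBearerToken(authHeader: string | null): string | null {
  if (!authHeader) return null;
  const match = authHeader.match(/^Bearer\s+(\S+)$/);
  return match ? match[1] : null;
}
```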
Example:

```bash
curl -X POST https://ai-gateway.i-f9f.workers.dev/v1/messages \
  -H "Authorization: Bearer $USER_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-v2@20241022",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
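The same call from TypeScript: building the request separately from sending it keeps the shape easy to test (`buildMessageRequest` is a name made up for this sketch):

```typescript
// Build the fetch arguments for a proxy request. The body follows the
// standard Anthropic Messages API format shown in the curl example.
function buildMessageRequest(userToken: string, content: string) {
  return {
    url: "https://ai-gateway.i-f9f.workers.dev/v1/messages",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${userToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "claude-3-5-sonnet-v2@20241022",
        max_tokens: 1024,
        messages: [{ role: "user", content }],
      }),
    },
  };
}

// Usage:
// const { url, init } = buildMessageRequest(token, "Hello");
// const res = await fetch(url, init);
```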
## Supported Models

Available models depend on what's enabled in your GCP project. Common options:

- `claude-sonnet-4@20250514` (recommended, default)
- `claude-opus-4@20250514`
- `claude-3-5-sonnet-v2@20241022`
- `claude-3-5-haiku@20241022`

To check which models are available, visit the Vertex AI Model Garden in your GCP Console.
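If the app lets users pick a model, a small guard can fall back to the recommended default when the requested model isn't enabled. A sketch (the helper name and the hard-coded set are illustrative; the real set depends on your GCP project):

```typescript
// Fall back to the recommended default for unknown models.
const DEFAULT_MODEL = "claude-sonnet-4@20250514";
const ENABLED_MODELS = new Set([
  DEFAULT_MODEL,
  "claude-opus-4@20250514",
  "claude-3-5-sonnet-v2@20241022",
  "claude-3-5-haiku@20241022",
]);

function resolveModel(requested?: string): string {
  return requested && ENABLED_MODELS.has(requested) ? requested : DEFAULT_MODEL;
}
```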
## Troubleshooting

- Verify that the `VERTEX_SERVICE_ACCOUNT_JSON` secret is set in the worker