.agents/features/ai-providers.md
The AI Providers module lets platform admins configure one or more LLM backends (OpenAI, Anthropic, Google, Azure, OpenRouter, Cloudflare, or a custom OpenAI-compatible endpoint) for use by AI pieces inside flows. It also supports an auto-provisioned "Activepieces" provider backed by OpenRouter when the platform's aiCreditsEnabled plan flag is set, complete with a Stripe-integrated credit top-up system and monthly reset via a system job.
Key files:

- `packages/server/api/src/app/ai/` — backend module (controller, service, entity)
- `packages/shared/src/lib/management/ai-providers/index.ts` — all shared Zod schemas, enums, and request/response types
- `packages/web/src/features/platform-admin/api/ai-provider-api.ts` — frontend API client
- `packages/web/src/features/platform-admin/hooks/ai-provider-hooks.ts` — TanStack Query hooks
- `packages/web/src/app/routes/platform/setup/ai/index.tsx` — platform admin AI setup page
- `packages/web/src/app/routes/platform/setup/ai/universal-pieces/ai-provider-card.tsx` — per-provider card component
- `packages/web/src/app/routes/platform/setup/ai/universal-pieces/upsert-provider-dialog.tsx` — create/edit provider dialog
- `packages/web/src/app/routes/platform/setup/ai/universal-pieces/upsert-provider-config-form.tsx` — provider config form
- `packages/web/src/app/routes/platform/setup/ai/universal-pieces/model-form-popover.tsx` — model selection popover
- `packages/web/src/features/agents/ai-model/index.tsx` — AI model selector used in agent step settings
- `packages/web/src/features/agents/ai-model/hooks.ts` — hooks for listing available models per provider

Key facts:

- The backend module is registered in `app.ts`.
- The Activepieces provider is gated by the `aiCreditsEnabled` plan flag; it is auto-provisioned when the `OPENROUTER_PROVISION_KEY` env var is set and `aiCreditsEnabled` is true.
- `AIProviderName` enum values: `openai`, `anthropic`, `google`, `azure`, `openrouter`, `cloudflare-gateway`, `custom`, `activepieces`.
- The `auth` field is AES-256-encrypted at rest and decrypted only for engine access.
- Entity `AIProvider`: `id`, `displayName`, `platformId` (UNIQUE with `provider`), `provider` (`AIProviderName` enum), `auth` (`EncryptedObject`), `config` (JSON). Relation: `platform` (CASCADE).
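As a rough illustration of the shared shapes described above, a minimal TypeScript sketch (field types beyond those named in the entity description, and the `violatesUniqueConstraint` helper, are illustrative assumptions, not the real code in `packages/shared`):

```typescript
// Minimal sketch of the AI-provider types described above. Illustrative
// only; the real definitions are Zod schemas in packages/shared.
export enum AIProviderName {
  OPENAI = 'openai',
  ANTHROPIC = 'anthropic',
  GOOGLE = 'google',
  AZURE = 'azure',
  OPENROUTER = 'openrouter',
  CLOUDFLARE = 'cloudflare-gateway',
  CUSTOM = 'custom',
  ACTIVEPIECES = 'activepieces',
}

export interface AIProvider {
  id: string
  displayName: string
  platformId: string              // UNIQUE together with `provider`
  provider: AIProviderName
  auth: unknown                   // EncryptedObject: AES-256-encrypted at rest
  config: Record<string, unknown>
}

// Helper illustrating the (platformId, provider) uniqueness constraint:
// a platform may hold at most one row per provider name.
export function violatesUniqueConstraint(
  existing: AIProvider[],
  candidate: Pick<AIProvider, 'platformId' | 'provider'>,
): boolean {
  return existing.some(
    (p) => p.platformId === candidate.platformId && p.provider === candidate.provider,
  )
}
```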
| Provider | Auth Fields | Notes |
|---|---|---|
| OPENAI | apiKey | GPT models, responses model variant |
| ANTHROPIC | apiKey | Claude models |
| GOOGLE | apiKey | Gemini models |
| AZURE | apiKey, deploymentName, instanceName | Azure OpenAI |
| OPENROUTER | apiKey | 200+ models |
| CLOUDFLARE | apiKey, accountId, gatewayId | Proxied via Cloudflare Workers AI |
| CUSTOM | apiKey, baseUrl | OpenAI-compatible (LM Studio, Ollama) |
| ACTIVEPIECES | apiKey, apiKeyHash (auto-provisioned) | Uses OpenRouter, managed by platform |
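The per-provider auth fields in the table can be modeled as a simple lookup. A hedged sketch (the lookup and the `missingAuthFields` helper are illustrative; the real validation is done by the shared Zod schemas):

```typescript
// Illustrative map of required auth fields per provider, mirroring the
// table above. Key strings match the AIProviderName enum values.
type ProviderKey =
  | 'openai' | 'anthropic' | 'google' | 'azure'
  | 'openrouter' | 'cloudflare-gateway' | 'custom' | 'activepieces'

const REQUIRED_AUTH_FIELDS: Record<ProviderKey, string[]> = {
  'openai': ['apiKey'],
  'anthropic': ['apiKey'],
  'google': ['apiKey'],
  'azure': ['apiKey', 'deploymentName', 'instanceName'],
  'openrouter': ['apiKey'],
  'cloudflare-gateway': ['apiKey', 'accountId', 'gatewayId'],
  'custom': ['apiKey', 'baseUrl'],
  'activepieces': ['apiKey', 'apiKeyHash'],
}

// Returns the names of required fields that are absent or empty.
export function missingAuthFields(
  provider: ProviderKey,
  auth: Record<string, unknown>,
): string[] {
  return REQUIRED_AUTH_FIELDS[provider].filter(
    (field) => auth[field] === undefined || auth[field] === '',
  )
}
```

For example, an Azure config supplying only `apiKey` would still be missing `deploymentName` and `instanceName`.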
Auto-created when aiCreditsEnabled flag is true (OPENROUTER_PROVISION_KEY env var set):
- `getOrCreateActivePiecesProviderAuthConfig()` auto-creates the provider.
- `enrichWithKeysIfNeeded()` calls the OpenRouter API to create an API key.
- Credit-to-dollar conversion: `platform.plan.includedAiCredits / 1000` (1,000 credits = $1).
- The `AI_CREDIT_UPDATE_CHECK` system job handles auto-renewal and top-up.

Models listed per provider are cached in memory; the cache is cleared daily at midnight via a cron job.
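The credit-to-dollar conversion above is plain arithmetic; a minimal sketch (the function name is illustrative, not the actual helper in the service):

```typescript
// 1,000 AI credits correspond to $1, so the dollar amount backing the
// auto-provisioned OpenRouter key is includedAiCredits / 1000.
export function aiCreditsToUsd(includedAiCredits: number): number {
  if (includedAiCredits < 0) {
    throw new Error('includedAiCredits must be non-negative')
  }
  return includedAiCredits / 1000
}
```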
Endpoints (under `/v1/ai-providers`):

- `GET /` — list providers (auto-creates ACTIVEPIECES if credits enabled)
- `GET /:provider/config` — get provider config + decrypted auth (engine-only access)
- `GET /:provider/models` — list available models (cached)
- `POST /` — create provider (validates credentials first)
- `POST /:id` — update provider (re-validates if auth changed; cannot update ACTIVEPIECES)
- `DELETE /:id` — delete provider (cannot delete ACTIVEPIECES)

During flow execution, AI pieces call `GET /v1/ai-providers/{provider}/config` to get credentials. The engine token provides authorization.
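A hedged sketch of how a piece could assemble that engine-authorized config request. The path shape comes from the docs above; the bearer-token header and the helper itself are assumptions about the transport, not the actual piece code:

```typescript
// Builds the request AI pieces send during flow execution to fetch
// provider credentials. Carrying the engine token in an Authorization
// header is an assumption for illustration.
export function providerConfigRequest(
  apiBaseUrl: string,
  provider: string,
  engineToken: string,
): { url: string; headers: Record<string, string> } {
  const base = apiBaseUrl.replace(/\/$/, '') // tolerate a trailing slash
  return {
    url: `${base}/v1/ai-providers/${encodeURIComponent(provider)}/config`,
    headers: { Authorization: `Bearer ${engineToken}` },
  }
}
```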
The platform admin AI setup page lives at /platform/setup/ai. It renders an ai-provider-card for each configured provider and an "Add Provider" button that opens upsert-provider-dialog. The upsert-provider-config-form adapts its fields to the selected AIProviderName. The model-form-popover lets admins configure which models are exposed per provider.
Inside the builder, the agent step settings use features/agents/ai-model/index.tsx (with hooks.ts) to render a model selector that queries GET /v1/ai-providers/:provider/models via aiProviderApi.listModelsForProvider().