.agents/features/chat.md
A platform-level AI chat assistant that lets users interact with an LLM to manage their Activepieces projects through natural language. The chat connects to the platform's configured AI provider, streams responses via the Vercel AI SDK, and exposes Activepieces resources (flows, tables, connections, runs) as callable tools through the project's MCP server. Conversations are persisted per-user with support for message compaction, file attachments, multi-project context switching, and a tool approval gate for destructive operations.
## Files

- `packages/server/api/src/app/ee/chat/chat.module.ts` — module registration with `chatEnabled` plan gate
- `packages/server/api/src/app/ee/chat/chat-controller.ts` — HTTP endpoints (conversations CRUD, messages, tool approvals)
- `packages/server/api/src/app/ee/chat/chat-service.ts` — core business logic (conversation management, message streaming)
- `packages/server/api/src/app/ee/chat/chat-conversation-entity.ts` — `ChatConversation` TypeORM entity
- `packages/server/api/src/app/ee/chat/chat-model-factory.ts` — creates an AI SDK `LanguageModel` from provider config (OpenAI, Anthropic, Google, Azure, Bedrock, Cloudflare, Custom)
- `packages/server/api/src/app/ee/chat/chat-compaction.ts` — long-conversation context management via summarization
- `packages/server/api/src/app/ee/chat/chat-approval-gate.ts` — Redis pub/sub gate for tool execution approval (5-minute timeout)
- `packages/server/api/src/app/ee/chat/chat-file-utils.ts` — file attachment processing (base64, MIME validation, 10 MB limit)
- `packages/server/api/src/app/ee/chat/tools/chat-tools.ts` — local LLM tools (title, project selection, action execution, cross-project listing)
- `packages/server/api/src/app/ee/chat/mcp/chat-mcp.ts` — connects to the Activepieces MCP server for project-scoped tools with approval wrapping
- `packages/server/api/src/app/ee/chat/history/chat-history.ts` — reconstructs chat history from the AI SDK `ModelMessage` format
- `packages/server/api/src/app/ee/chat/prompt/chat-prompt.ts` — builds the system prompt from markdown templates in `src/assets/prompts/`
- `packages/shared/src/lib/ee/chat/index.ts` — shared Zod schemas and types (`ChatConversation`, request DTOs, `ChatHistoryMessage`)
- `packages/web/src/app/routes/chat-with-ai/index.tsx` — main chat page component
- `packages/web/src/app/routes/chat-with-ai/ai-chat-box.tsx` — chat interface with provider check, message streaming, tool approvals
- `packages/web/src/app/routes/chat-with-ai/conversation-list.tsx` — conversation history sidebar
- `packages/web/src/app/routes/chat-with-ai/components/` — sub-components (input, messages, model selector, project selector, tool approval form)
- `packages/web/src/features/chat/lib/chat-api.ts` — API client for `/v1/chat/*` endpoints
- `packages/web/src/features/chat/lib/use-chat.ts` — `useAgentChat()` hook managing conversation state
- `packages/web/src/features/chat/lib/use-tool-approval.ts` — hook for tool approval requests

## Feature gate

Available when `platform.plan.chatEnabled` is true.

## Local tools

`ap_set_session_title`, `ap_select_project`, `ap_run_one_time_action`, `ap_list_across_projects`

## AI provider

Providers are filtered by the `enabledForChat` flag; the chat resolves the first enabled provider and its default model.

## Data model

`ChatConversation`: `id`, `platformId`, `userId`, `projectId` (nullable), `title` (nullable), `modelName` (nullable), `messages` (JSONB array of `ModelMessage`), `summary` (text, nullable — compaction summary), `summarizedUpToIndex` (int, nullable — index up to which messages are summarized).
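To make the compaction fields concrete, here is an illustrative TypeScript sketch (not the actual entity or `chat-compaction.ts` code) of how `summary` and `summarizedUpToIndex` could shape the context sent to the model: older messages are replaced by one summary message, recent ones pass through verbatim. The `effectiveContext` helper and the simplified `ModelMessage` shape are assumptions for illustration.

```typescript
// Simplified stand-in for the AI SDK's ModelMessage.
type ModelMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Mirrors the ChatConversation fields described above (illustrative types).
interface ChatConversation {
  id: string;
  platformId: string;
  userId: string;
  projectId: string | null;
  title: string | null;
  modelName: string | null;
  messages: ModelMessage[];           // full history, stored as JSONB
  summary: string | null;             // compaction summary of older messages
  summarizedUpToIndex: number | null; // messages[0..index) are covered by `summary`
}

// Hypothetical helper: build the effective context window for the LLM.
function effectiveContext(c: ChatConversation): ModelMessage[] {
  if (c.summary === null || c.summarizedUpToIndex === null) {
    return c.messages; // nothing compacted yet: send the whole history
  }
  return [
    { role: 'system', content: `Summary of earlier conversation: ${c.summary}` },
    ...c.messages.slice(c.summarizedUpToIndex),
  ];
}
```

The stored history stays complete; only the window handed to the model shrinks, which is why `getMessages()` can still reconstruct the full `ChatHistoryMessage[]` for the UI.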
Index: `idx_chat_conversation_platform_user_created_id` on (`platformId`, `userId`, `created`, `id`).

## Service methods (chat-service.ts)

- `createConversation()` — creates a new conversation for a user on a platform
- `listConversations()` — cursor-paginated list of the user's conversations, ordered by creation date descending
- `getConversationOrThrow()` — fetches a conversation, enforcing ownership (`platformId` + `userId`)
- `updateConversation()` — updates `title` and/or `modelName`
- `deleteConversation()` — deletes a conversation after ownership check
- `getMessages()` — reconstructs `ChatHistoryMessage[]` from stored `ModelMessage[]`
- `setProjectContext()` — sets or clears the project scope, verifying user has access
- `sendMessage()` — the main streaming flow: resolves provider, connects MCP, builds prompt, runs `streamText()` with compaction, persists assistant response on completion

## Local tool behavior

- `ap_set_session_title` — auto-names the conversation after the first exchange
- `ap_select_project` — switches project context (scopes MCP tools to that project)
- `ap_run_one_time_action` — executes a single piece action ad hoc (e.g. "check my inbox"); auto-discovers connections across projects
- `ap_list_across_projects` — lists flows, tables, runs, or connections across all user-accessible projects

## API endpoints

- `POST /v1/chat/conversations` — create conversation
- `GET /v1/chat/conversations` — list conversations (`cursor`, `limit`)
- `GET /v1/chat/conversations/:id` — get conversation
- `POST /v1/chat/conversations/:id` — update conversation (`title`, `modelName`)
- `DELETE /v1/chat/conversations/:id` — delete conversation
- `GET /v1/chat/conversations/:id/messages` — get conversation messages
- `POST /v1/chat/conversations/:id/messages` — send message (streaming response)
- `POST /v1/chat/tool-approvals/:gateId` — approve or deny a tool execution
- `POST /v1/chat/conversations/:id/project-context` — set project context

All endpoints require `PrincipalType.USER` authentication at the platform level.
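As a sketch of how the web client could wrap these endpoints (the real client is `packages/web/src/features/chat/lib/chat-api.ts`; class name, method names, and the `{ approved }` body shape here are illustrative assumptions, with `fetch` injected so the sketch is testable):

```typescript
// Minimal fetch signature the sketch needs (subset of the real Fetch API).
type Fetch = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string },
) => Promise<{ json(): Promise<unknown> }>;

// Hypothetical thin client over the /v1/chat endpoints listed above.
class ChatApi {
  constructor(
    private baseUrl: string,
    private token: string,
    private fetchFn: Fetch,
  ) {}

  private request(method: string, path: string, body?: unknown): Promise<unknown> {
    return this.fetchFn(`${this.baseUrl}${path}`, {
      method,
      headers: {
        Authorization: `Bearer ${this.token}`,
        'Content-Type': 'application/json',
      },
      body: body === undefined ? undefined : JSON.stringify(body),
    }).then((r) => r.json());
  }

  createConversation() {
    return this.request('POST', '/v1/chat/conversations');
  }

  listConversations(cursor?: string, limit = 20) {
    const qs = new URLSearchParams({ limit: String(limit) });
    if (cursor) qs.set('cursor', cursor);
    return this.request('GET', `/v1/chat/conversations?${qs}`);
  }

  approveTool(gateId: string, approved: boolean) {
    return this.request('POST', `/v1/chat/tool-approvals/${gateId}`, { approved });
  }
}
```

Note that `POST /v1/chat/conversations/:id/messages` streams its response, so a real client reads that one as a stream rather than via `json()`.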
## Message flow

1. The client sends `POST /conversations/:id/messages`.
2. `streamText()` streams the LLM response with local tools + MCP tools available.
3. When a tool requires approval, the client answers via `POST /tool-approvals/:gateId`, unblocking the gate via Redis pub/sub.
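The approval-gate pattern in step 3 can be sketched in-memory. This is an illustrative stand-in, not `chat-approval-gate.ts`: the real implementation uses Redis pub/sub so an approval posted to any server instance reaches the instance holding the paused tool call, and its timeout is 5 minutes. The function names and `Verdict` type are assumptions.

```typescript
type Verdict = 'approved' | 'denied' | 'timeout';

// In-memory stand-in for the Redis channel: gateId -> resolver.
const pending = new Map<string, (v: Verdict) => void>();

// Called from the tool wrapper: pauses tool execution until the user
// answers or the timeout elapses (5 minutes in the real implementation).
function waitForApproval(gateId: string, timeoutMs: number): Promise<Verdict> {
  return new Promise((resolve) => {
    const timer = setTimeout(() => {
      pending.delete(gateId);
      resolve('timeout');
    }, timeoutMs);
    pending.set(gateId, (verdict) => {
      clearTimeout(timer);
      pending.delete(gateId);
      resolve(verdict);
    });
  });
}

// Called by POST /v1/chat/tool-approvals/:gateId (a Redis publish in
// production). Returns false for an unknown or already-resolved gate.
function resolveGate(gateId: string, approved: boolean): boolean {
  const resolver = pending.get(gateId);
  if (!resolver) return false;
  resolver(approved ? 'approved' : 'denied');
  return true;
}
```

Wrapping each destructive MCP tool so its `execute` first awaits `waitForApproval` is what lets a streaming `streamText()` call pause mid-response until the user clicks approve or deny.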