docs/development/basic/architecture.mdx
LobeHub is an open-source AI Agent platform built on Next.js, enabling users to interact with AI through natural language, use tools, manage knowledge bases, and more. The following is an overview of LobeHub's architecture design.
The overall architecture of LobeHub consists of the following core layers:
| Layer | Description |
| --- | --- |
| Frontend | Next.js RSC + React Router DOM hybrid SPA |
| Backend API | RESTful WebAPI + tRPC Routers |
| Runtime | Model Runtime + Agent Runtime |
| Auth | Better Auth (email/password + SSO) |
| Data Storage | PostgreSQL + Redis + S3 |
| Marketplace | Agent Market + MCP Tool Market |
The frontend uses the Next.js framework with a Next.js RSC + React Router DOM hybrid routing approach: Next.js App Router handles server-rendered pages (e.g., auth pages), while React Router DOM powers the main SPA.
| Router | Use Case | Location |
|---|---|---|
| Next.js App Router | Auth pages, SSR, static routes | src/app/(backend)/ |
| React Router DOM | Main chat SPA, agent interfaces | src/spa/ + src/routes/ |
Why hybrid? SSR delivers auth page security, SEO, and static page performance; SPA delivers instant navigation and reactive state for interactive chat and agent interfaces — the best of both worlds.
Key tech stack: `@lobehub/ui`, `antd`.

Component priority when building UI: project components (`src/components/`) → `@lobehub/ui` → Ant Design (`antd`). Always prefer project-level components for consistency.
Frontend code is organized by responsibility under src/. See Directory Structure for details.
The backend provides two API styles:

- **RESTful WebAPI** (`src/app/(backend)/webapi/`): Handles endpoints requiring special processing such as chat streaming, TTS, and file serving
- **tRPC Routers** (`src/server/routers/`): Type-safe main business routes, grouped by runtime:
  - `lambda/` — Main business (agent, session, message, topic, file, knowledge, settings, etc.)
  - `async/` — Long-running async operations (file processing, image generation, RAG evaluation)
  - `tools/` — Tool invocations (search, MCP, market)
  - `mobile/` — Mobile-specific routes

`@lobechat/model-runtime` (`packages/model-runtime/`) is the LLM API adapter layer that normalizes API differences across 30+ AI providers (OpenAI, Anthropic, Google, Bedrock, Ollama, etc.), providing a unified calling interface. Each provider has its own adapter implementation. It is stateless — each call is independent.
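The adapter idea behind the Model Runtime layer can be sketched in plain TypeScript. This is an illustrative sketch only, not the package's real API: the interface and adapter names are hypothetical, and real adapters would also handle auth, streaming, and error normalization.

```typescript
// Unified request shape the caller always uses, regardless of provider.
interface UnifiedChatRequest {
  model: string;
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
}

// Each provider adapter maps the unified request onto its own wire format,
// so callers never branch on provider.
interface ProviderAdapter {
  name: string;
  toPayload(req: UnifiedChatRequest): Record<string, unknown>;
}

// OpenAI-style APIs accept the message list as-is.
const openaiLike: ProviderAdapter = {
  name: 'openai',
  toPayload: (req) => ({ model: req.model, messages: req.messages }),
};

// Anthropic-style APIs take the system prompt as a separate top-level field.
const anthropicLike: ProviderAdapter = {
  name: 'anthropic',
  toPayload: (req) => ({
    model: req.model,
    system: req.messages.find((m) => m.role === 'system')?.content ?? '',
    messages: req.messages.filter((m) => m.role !== 'system'),
  }),
};

// Statelessness: each call just picks an adapter and converts; no state
// is carried between calls.
function buildPayload(adapter: ProviderAdapter, req: UnifiedChatRequest) {
  return adapter.toPayload(req);
}

const req: UnifiedChatRequest = {
  model: 'some-model',
  messages: [
    { role: 'system', content: 'Be terse.' },
    { role: 'user', content: 'Hi' },
  ],
};

console.log(buildPayload(openaiLike, req));
console.log(buildPayload(anthropicLike, req));
```

Adding a new provider then means writing one more adapter, while every caller keeps using the same unified request shape.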
@lobechat/agent-runtime (packages/agent-runtime/) is the agent orchestration engine that sits above Model Runtime, driving the full lifecycle of multi-step AI agent behavior:
- `GroupOrchestrationRuntime` supports a Supervisor + Executor pattern for multi-agent collaboration

In short: Model Runtime handles "how to communicate with an LLM provider"; Agent Runtime handles "how to run a complete agent using LLMs, tools, and human approvals."
Agent execution flow:
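A minimal sketch of such a loop, in plain TypeScript. Everything here is hypothetical and illustrative — the real Agent Runtime API differs, and a scripted stand-in replaces the actual model call — but it shows the cycle the text describes: call the model, execute any requested tool, feed the result back, stop on a final answer.

```typescript
// One model turn either requests a tool or produces a final answer.
type Step =
  | { kind: 'tool_call'; tool: string; args: string }
  | { kind: 'final'; answer: string };

type Tool = (args: string) => string;

// Scripted stand-in for an LLM: asks for a tool once, then answers.
function fakeModel(history: string[]): Step {
  if (!history.some((h) => h.startsWith('tool:'))) {
    return { kind: 'tool_call', tool: 'search', args: 'lobehub' };
  }
  return { kind: 'final', answer: `done after ${history.length} tool calls` };
}

function runAgent(tools: Record<string, Tool>, maxSteps = 5): string {
  const history: string[] = [];
  for (let i = 0; i < maxSteps; i++) {
    const step = fakeModel(history);
    if (step.kind === 'final') return step.answer;
    // Execute the requested tool and append its result to the transcript,
    // so the next model turn can see it. A real runtime would insert a
    // human-approval check here before running the tool.
    const result = tools[step.tool](step.args);
    history.push(`tool:${step.tool} -> ${result}`);
  }
  throw new Error('step limit reached');
}

const answer = runAgent({ search: (q) => `results for ${q}` });
console.log(answer); // -> "done after 1 tool calls"
```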
LobeHub uses Better Auth as the authentication framework, supporting email/password login and SSO.
Auth configuration is in src/auth.ts, with related routes under src/app/(backend)/api/.
| Storage | Usage |
| --- | --- |
| PostgreSQL | Primary database for users, sessions, messages, agent configs, etc. |
| Redis | Caching, session state, rate limiting |
| S3 | File storage (uploads, images, knowledge base files, etc.) |
Database operations use Drizzle ORM, with schemas defined in packages/database/src/schemas/.
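As a rough illustration of what a schema file in that directory looks like, here is a hypothetical table definition in Drizzle's `pg-core` style. The table and column names are invented for this sketch; the real schemas in `packages/database/src/schemas/` differ in detail.

```typescript
import { pgTable, text, timestamp } from 'drizzle-orm/pg-core';

// Hypothetical messages table, for illustration only.
export const messages = pgTable('messages', {
  id: text('id').primaryKey(),
  sessionId: text('session_id').notNull(),
  role: text('role').notNull(),
  content: text('content'),
  createdAt: timestamp('created_at').defaultNow().notNull(),
});
```

Drizzle infers TypeScript types from such definitions, so queries against this table are type-checked end to end.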