Welcome to your new TanStack Start / DB + Electric app!
You need the tools listed in the `.tool-versions` file; check it for the compatible version of each.
Make sure you have Docker running. Docker is used to run the Postgres and Electric services defined in docker-compose.yaml.
Make sure you have Caddy installed and have installed its root certificate using:
caddy trust # may require sudo
Electric benefits significantly from HTTP/2 multiplexing. HTTP/2 requires HTTPS. Caddy is necessary for HTTPS to work in local development.
Create a new project based on this starter:
npx gitpick electric-sql/electric/tree/main/examples/tanstack-db-web-starter my-tanstack-db-project
cd my-tanstack-db-project
Copy the .env.example file to .env:
cp .env.example .env
> [!TIP]
> You can edit the values in the `.env` file. The default values are configured for local development with Docker. You can run against a different Postgres and Electric, for example using the Electric Cloud, by changing the `DATABASE_URL` and `ELECTRIC_URL`.
Install the dependencies:
pnpm install
Start the backend services (Postgres and Electric) running in the background using Docker:
pnpm backend:up
Apply the database migrations:
pnpm migrate
Start the dev server:
pnpm dev
Open the application at https://localhost:5173.
> [!TIP]
> If you run into any issues, see the troubleshooting section below.
This starter implements a secure authentication pattern for Electric sync. If you're new to Electric, it's worth understanding how the pieces fit together since this is a novel part of the stack.
Electric is a Postgres sync engine that streams data to clients via HTTP. Unlike traditional REST APIs where each request is authenticated individually, Electric maintains persistent sync connections. This requires a different approach to authorization.
The starter uses a Shape Proxy Pattern where:
┌─────────────────────────────────────────────────────────────┐
│ Client (Browser) │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ TanStack DB │ │ Electric │ │ tRPC │ │
│ │ Collection │───▶│ Shape Client │ │ Client │ │
│ └──────────────┘ └──────────────┘ └──────────────┘ │
│ │ │ │ │
└─────────│───────────────────│────────────────────│──────────┘
│ │ │
│ Session Cookie │ (automatic) │
│ ▼ ▼
┌─────────│───────────────────────────────────────────────────┐
│ │ Server (TanStack Start) │
│ │ │
│ │ ┌────────────────────────────────────────────┐ │
│ │ │ Shape Proxy Routes │ │
│ │ │ (/api/todos, /api/projects, etc) │ │
│ │ │ │ │
│ │ │ 1. Validate session │ │
│ │ │ 2. Add WHERE user_id = ? │ │
│ │ │ 3. Forward to Electric │ │
│ │ └────────────────────────────────────────────┘ │
│ │ │ │
│ │ ┌────────────────────│───────────────────────┐ │
│ │ │ tRPC Router ▼ │ │
│ └───▶│ - Validates session │ │
│ │ - Checks ownership before mutations │ │
│ │ - Returns transaction IDs for sync │ │
│ └────────────────────────────────────────────┘ │
│ │ │
└───────────────────────────────────│─────────────────────────┘
│
┌────────────────┴────────────────┐
▼ ▼
┌──────────┐ ┌──────────┐
│ Electric │ │ Postgres │
│ Server │◀────────────────────▶│ Database │
└──────────┘ └──────────┘
Each table that syncs via Electric has a corresponding API route that acts as an authenticated proxy. Here's what happens in /api/todos:
const serve = async ({ request }: { request: Request }) => {
// 1. Validate the session
const session = await auth.api.getSession({ headers: request.headers })
if (!session) {
return new Response(JSON.stringify({ error: "Unauthorized" }), {
status: 401,
})
}
// 2. Build the Electric URL with row-level filtering
const originUrl = prepareElectricUrl(request.url)
originUrl.searchParams.set("table", "todos")
// Only sync rows where the user has access
const filter = `'${session.user.id}' = ANY(user_ids)`
originUrl.searchParams.set("where", filter)
// 3. Proxy the request to Electric
return proxyElectricRequest(originUrl)
}
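The `prepareElectricUrl` and `proxyElectricRequest` helpers live in `src/lib/electric-proxy.ts` and aren't shown in this README. A minimal sketch of what they might do follows; the behavior here is an assumption for illustration (which query params are forwarded, and the default Electric port taken from `docker-compose.yaml`):

```typescript
// Sketch of src/lib/electric-proxy.ts (assumed implementation)
const ELECTRIC_URL = process.env.ELECTRIC_URL ?? "http://localhost:30000"

// Rebase the incoming request URL onto Electric's shape endpoint,
// forwarding only Electric's shape-protocol query params
export function prepareElectricUrl(requestUrl: string): URL {
  const incoming = new URL(requestUrl)
  const origin = new URL("/v1/shape", ELECTRIC_URL)
  for (const param of ["offset", "handle", "live", "cursor"]) {
    const value = incoming.searchParams.get(param)
    if (value !== null) origin.searchParams.set(param, value)
  }
  return origin
}

// Fetch from Electric and stream the body back to the browser
export async function proxyElectricRequest(originUrl: URL): Promise<Response> {
  const response = await fetch(originUrl)
  const headers = new Headers(response.headers)
  // Drop headers that no longer describe the proxied body
  headers.delete("content-encoding")
  headers.delete("content-length")
  return new Response(response.body, { status: response.status, headers })
}
```

Note that the proxy copies only known shape params, so a client cannot smuggle its own `where` or `table` past the route's filtering.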
This pattern ensures that the session is validated on every shape request and that the `WHERE` clause filters rows at the database level, so clients can only ever sync data they are authorized to see.

While Electric handles reads, mutations go through tRPC with additional authorization:
// src/lib/trpc/todos.ts
delete: authedProcedure
.input(deleteTodoSchema)
.mutation(async ({ ctx, input }) => {
// Find the todo first
const [todo] = await ctx.db.select().from(todosTable).where(eq(todosTable.id, input.id))
// Verify the todo exists and the user has access
if (!todo || !todo.user_ids.includes(ctx.session.user.id)) {
throw new TRPCError({ code: "FORBIDDEN" })
}
// Proceed with deletion
// ...
})
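Filling in the elided step, a complete delete might look like the following. This is a sketch, not the starter's exact code: it reuses the `generateTxId` helper and the txid return convention that the other mutations in this README follow.

```typescript
// Sketch: assumes generateTxId and the txid return convention
// used by the create mutation shown later in this README
delete: authedProcedure
  .input(deleteTodoSchema)
  .mutation(async ({ ctx, input }) => {
    return await ctx.db.transaction(async (tx) => {
      const txid = await generateTxId(tx)
      const [todo] = await tx
        .select()
        .from(todosTable)
        .where(eq(todosTable.id, input.id))
      // Verify the todo exists and the user has access
      if (!todo || !todo.user_ids.includes(ctx.session.user.id)) {
        throw new TRPCError({ code: "FORBIDDEN" })
      }
      await tx.delete(todosTable).where(eq(todosTable.id, input.id))
      // The txid lets the client match this mutation to the synced change
      return { txid }
    })
  }),
```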
When you add a new table (see Adding a New Table), you need to:
- Define the table schema (including an ownership column such as `user_id`)
- Create an authenticated shape proxy route for reads
- Add tRPC mutations that check ownership for writes
- Create a TanStack DB collection wired to both

The "Adding a New Table" section below shows the complete pattern.
For more details on Electric's authentication model, see:
Here's how to add a new table to your app (using a "categories" table as an example):
Add your table to src/db/schema.ts:
export const categoriesTable = pgTable("categories", {
id: integer().primaryKey().generatedAlwaysAsIdentity(),
name: varchar({ length: 255 }).notNull(),
color: varchar({ length: 7 }), // hex color
created_at: timestamp({ withTimezone: true }).notNull().defaultNow(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
})
// Add Zod schemas
export const selectCategorySchema = createSelectSchema(categoriesTable)
export const createCategorySchema = createInsertSchema(categoriesTable).omit({
created_at: true,
})
export const updateCategorySchema = createUpdateSchema(categoriesTable)
# Generate migration file
pnpm migrate:generate
# Apply migration to database
pnpm migrate
Create src/routes/api/categories.ts:
import { createFileRoute } from "@tanstack/react-router"
import { auth } from "@/lib/auth"
import { prepareElectricUrl, proxyElectricRequest } from "@/lib/electric-proxy"
const serve = async ({ request }: { request: Request }) => {
const session = await auth.api.getSession({ headers: request.headers })
if (!session) {
return new Response(JSON.stringify({ error: "Unauthorized" }), {
status: 401,
headers: { "content-type": "application/json" },
})
}
const originUrl = prepareElectricUrl(request.url)
originUrl.searchParams.set("table", "categories")
// Filter to user's own categories
const filter = `user_id = '${session.user.id}'`
originUrl.searchParams.set("where", filter)
return proxyElectricRequest(originUrl)
}
export const Route = createFileRoute("/api/categories")({
server: {
handlers: {
GET: serve,
},
},
})
Create src/lib/trpc/categories.ts:
import { router, authedProcedure, generateTxId } from "@/lib/trpc"
import { z } from "zod"
import { eq, and } from "drizzle-orm"
import {
categoriesTable,
createCategorySchema,
updateCategorySchema,
} from "@/db/schema"
export const categoriesRouter = router({
create: authedProcedure
.input(createCategorySchema)
.mutation(async ({ ctx, input }) => {
const result = await ctx.db.transaction(async (tx) => {
const txid = await generateTxId(tx)
const [newItem] = await tx
.insert(categoriesTable)
.values({ ...input, user_id: ctx.session.user.id })
.returning()
return { item: newItem, txid }
})
return result
}),
// Add update and delete following the same pattern...
})
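The update and delete procedures inside `categoriesRouter` can follow the same transaction-plus-txid shape as `create`. A sketch; the input shapes and the `user_id` ownership check are assumptions consistent with the schema above:

```typescript
// Sketch: additional procedures for categoriesRouter
update: authedProcedure
  .input(z.object({ id: z.number(), data: updateCategorySchema }))
  .mutation(async ({ ctx, input }) => {
    return await ctx.db.transaction(async (tx) => {
      const txid = await generateTxId(tx)
      const [updated] = await tx
        .update(categoriesTable)
        .set(input.data)
        // Ownership check folded into the WHERE clause
        .where(
          and(
            eq(categoriesTable.id, input.id),
            eq(categoriesTable.user_id, ctx.session.user.id)
          )
        )
        .returning()
      return { item: updated, txid }
    })
  }),

delete: authedProcedure
  .input(z.object({ id: z.number() }))
  .mutation(async ({ ctx, input }) => {
    return await ctx.db.transaction(async (tx) => {
      const txid = await generateTxId(tx)
      await tx
        .delete(categoriesTable)
        .where(
          and(
            eq(categoriesTable.id, input.id),
            eq(categoriesTable.user_id, ctx.session.user.id)
          )
        )
      return { txid }
    })
  }),
```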
Add to src/routes/api/trpc/$.ts:
import { categoriesRouter } from "@/lib/trpc/categories"
export const appRouter = router({
// ... existing routers
categories: categoriesRouter,
})
Add to src/lib/collections.ts:
export const categoriesCollection = createCollection(
electricCollectionOptions({
id: "categories",
shapeOptions: {
url: "/api/categories",
parser: {
timestamptz: (date: string) => new Date(date),
},
},
schema: selectCategorySchema,
getKey: (item) => item.id,
onInsert: async ({ transaction }) => {
const { modified: newCategory } = transaction.mutations[0]
const result = await trpc.categories.create.mutate({
name: newCategory.name,
color: newCategory.color,
})
return { txid: result.txid }
},
// Add onUpdate, onDelete as needed
})
)
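The `onUpdate` and `onDelete` handlers mirror `onInsert`. A sketch, assuming matching `update` and `delete` procedures exist on the categories router:

```typescript
// Fragment for the electricCollectionOptions config above (sketch;
// assumes trpc.categories.update and trpc.categories.delete exist)
onUpdate: async ({ transaction }) => {
  const { original, modified } = transaction.mutations[0]
  const result = await trpc.categories.update.mutate({
    id: original.id,
    data: { name: modified.name, color: modified.color },
  })
  return { txid: result.txid }
},
onDelete: async ({ transaction }) => {
  const { original } = transaction.mutations[0]
  const result = await trpc.categories.delete.mutate({ id: original.id })
  return { txid: result.txid }
},
```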
Preload in route loaders and use with useLiveQuery:
// In route loader
export const Route = createFileRoute("/my-route")({
loader: async () => {
await Promise.all([categoriesCollection.preload()])
},
})
// In component
const { data: categories } = useLiveQuery((q) =>
q.from({ categoriesCollection }).orderBy(/* ... */)
)
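For instance, sorting categories by name could look like this (the `orderBy` callback form here is assumed from TanStack DB's live query API):

```typescript
const { data: categories } = useLiveQuery((q) =>
  q
    .from({ category: categoriesCollection })
    .orderBy(({ category }) => category.name, "asc")
)
```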
That's it! Your new table is now fully integrated with Electric sync, tRPC mutations, and TanStack DB queries.
Electric SQL's shape delivery benefits significantly from HTTP/2 multiplexing. Without HTTP/2, each shape subscription creates a new HTTP/1.1 connection, which browsers limit to 6 concurrent connections per domain. This creates a bottleneck that makes shapes appear slow.
Caddy provides HTTP/2 support with automatic HTTPS, giving you:
The Vite development server runs on HTTP/1.1 only, so Caddy acts as a reverse proxy to upgrade the connection.
Once you've installed Caddy, install its root certificate using:
caddy trust
This is necessary for HTTP/2 to work without SSL warnings/errors in the browser.
When you run `pnpm dev`:
- A `Caddyfile` is automatically generated with your project name
- The app is served at `https://<project-name>.localhost`
- `http://localhost:5173` still works but will be slower for Electric shapes

If Caddy fails to start:
Test Caddy manually:
caddy start
Check certificate trust:
caddy trust
# To remove later: caddy untrust
Verify Caddyfile was generated:
Look for a Caddyfile in your project root after running pnpm dev
Stop conflicting Caddy instances:
caddy stop
Check for port conflicts: Caddy needs ports 80 and 443 available
| Issue | Symptoms | Solution |
|---|---|---|
| Docker not running | `docker compose ps` shows nothing | Start Docker Desktop/daemon |
| Caddy not trusted | SSL warnings in browser | Run `caddy trust` (see Caddy section above) |
| Port conflicts | Postgres (54321) or Electric (30000) in use | Stop conflicting services or change ports in `docker-compose.yaml` |
| Missing `.env` | Database connection errors | Copy `.env.example` to `.env` |
| Caddy fails to start | Caddy exited with code 1 | Run `caddy start` manually to see the error |
For troubleshooting, these commands are helpful:
# Check Docker services status
docker compose ps
# View Electric and Postgres logs
docker compose logs -f electric postgres
# Test database connectivity
psql $DATABASE_URL -c "SELECT 1"
# Start Caddy manually to surface any errors
caddy start
To build this application for production:
pnpm build
Before deploying to production, ensure you have configured:
# Authentication - REQUIRED in production
BETTER_AUTH_SECRET=your-secret-key-here
# Electric Cloud (if using hosted Electric)
ELECTRIC_SOURCE_ID=your-source-id
ELECTRIC_SOURCE_SECRET=your-source-secret
# Database (adjust for your production database)
DATABASE_URL=postgresql://user:pass@your-prod-db:5432/dbname
⚠️ Important: The current setup allows any email/password combination to work in development. This is automatically disabled in production, but you still need to:
- Configure real authentication providers in `src/lib/auth.ts` (Google, GitHub, etc.)
- Update the `trustedOrigins` settings for your production domains
- Set `NODE_ENV=production`
- Set a strong `BETTER_AUTH_SECRET` (minimum 32 characters)

The starter includes an AGENTS.md. Depending on which AI coding tool you use, you may need to copy or move it to the right file name, e.g. `.cursor/rules`.
This project uses Tailwind CSS for styling.
This project uses TanStack Router. The initial setup uses file-based routing, which means that the routes are managed as files in `src/routes`.
To add a new route to your application, just add a new file in the `./src/routes` directory.
TanStack Router will automatically generate the content of the route file for you.
Now that you have two routes you can use a Link component to navigate between them.
To use SPA (Single Page Application) navigation you will need to import the Link component from @tanstack/react-router.
import { Link } from "@tanstack/react-router"
Then anywhere in your JSX you can use it like so:
<Link to="/about">About</Link>
This will create a link that will navigate to the /about route.
More information on the Link component can be found in the Link documentation.
In the File Based Routing setup the layout is located in src/routes/__root.tsx. Anything you add to the root route will appear in all the routes. The route content will appear in the JSX where you use the <Outlet /> component.
Here is an example layout that includes a header:
import { Outlet, createRootRoute } from "@tanstack/react-router"
import { TanStackRouterDevtools } from "@tanstack/react-router-devtools"
import { Link } from "@tanstack/react-router"
export const Route = createRootRoute({
component: () => (
<>
<header>
<nav>
<Link to="/">Home</Link>
<Link to="/about">About</Link>
</nav>
</header>
<Outlet />
<TanStackRouterDevtools />
</>
),
})
The <TanStackRouterDevtools /> component is not required, so you can remove it if you don't want it in your layout.
More information on layouts can be found in the Layouts documentation.
There are multiple ways to fetch data in your application. You can use TanStack DB to fetch data from a server. But you can also use the loader functionality built into TanStack Router to load the data for a route before it's rendered.
For example:
const peopleRoute = createRoute({
getParentRoute: () => rootRoute,
path: "/people",
loader: async () => {
const response = await fetch("https://swapi.dev/api/people")
return response.json() as Promise<{
results: {
name: string
}[]
}>
},
component: () => {
const data = peopleRoute.useLoaderData()
return (
<ul>
{data.results.map((person) => (
<li key={person.name}>{person.name}</li>
))}
</ul>
)
},
})
Loaders simplify your data fetching logic dramatically. Check out more information in the Loader documentation.
TanStack DB gives you robust support for real-time sync, live queries and local writes, with no stale data, super fast re-rendering and sub-millisecond cross-collection queries, even for large, complex apps.
Electric is a Postgres sync engine. It solves the hard problems of sync for you, including partial replication, fan-out, and data delivery.
Built on a TypeScript implementation of differential dataflow, TanStack DB provides:
- **Collections** - Typed sets of objects that can mirror a backend table or be populated with filtered views like `pendingTodos` or `decemberNewTodos`. Collections are just JavaScript data that you can load on demand.
- **Live Queries** - Run reactively against and across collections with support for joins, filters and aggregates. Powered by differential dataflow, query results update incrementally without re-running the whole query.
- **Transactional Optimistic Mutations** - Batch and stage local changes across collections with immediate application of local optimistic updates. Sync transactions to the backend with automatic rollbacks and management of optimistic state.
This starter proxies ElectricSQL shapes through server routes for auth-aware filtering. Use the proxied endpoints in shapeOptions.url:
import { createCollection } from "@tanstack/react-db"
import { electricCollectionOptions } from "@tanstack/electric-db-collection"
export const todoCollection = createCollection(
electricCollectionOptions<Todo>({
id: "todos",
schema: todoSchema,
// Electric syncs data using "shapes" - filtered views on database tables
shapeOptions: {
url: "/api/todos",
parser: {
timestamptz: (s: string) => new Date(s),
},
},
getKey: (item) => item.id,
onInsert: async ({ transaction }) => {
const { modified: newTodo } = transaction.mutations[0]
const result = await trpc.todos.create.mutate({
text: newTodo.text,
completed: newTodo.completed,
// ... other fields
})
return { txid: result.txid }
},
// You can also implement onUpdate, onDelete as needed
})
)
Apply mutations with local optimistic state that automatically syncs:
const AddTodo = () => {
return (
<Button
onClick={() =>
todoCollection.insert({
id: crypto.randomUUID(),
text: "🔥 Make app faster",
completed: false,
})
}
/>
)
}
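Updates and deletes work the same way; `update` takes the item's key and a mutator callback (API shape per TanStack DB's collection interface):

```typescript
// Optimistically toggle a todo; TanStack DB applies the change locally
// and calls the collection's onUpdate handler to persist it
todoCollection.update(todo.id, (draft) => {
  draft.completed = !draft.completed
})

// Optimistically remove a todo; triggers onDelete
todoCollection.delete(todo.id)
```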
Use live queries to read data reactively across collections:
import { useLiveQuery, eq } from "@tanstack/react-db"
const Todos = () => {
// Read data using live queries with cross-collection joins
const { data: todos } = useLiveQuery((q) =>
q
.from({ todo: todoCollection })
.join({ list: listCollection }, ({ list, todo }) =>
eq(list.id, todo.list_id)
)
.where(({ list }) => eq(list.active, true))
.select(({ list, todo }) => ({
id: todo.id,
status: todo.status,
text: todo.text,
list_name: list.name,
}))
)
return (
<ul>
{todos.map((todo) => (
<li key={todo.id}>
{todo.text} - {todo.list_name}
</li>
))}
</ul>
)
}
This pattern provides blazing fast, cross-collection live queries and local optimistic mutations with automatically managed optimistic state, all synced in real-time with ElectricSQL.
This starter uses tRPC v10 for type-safe mutations while Electric handles real-time reads:
// src/lib/trpc-client.ts
import { createTRPCProxyClient, httpBatchLink } from "@trpc/client"
import type { AppRouter } from "@/routes/api/trpc/$"
export const trpc = createTRPCProxyClient<AppRouter>({
links: [
httpBatchLink({
url: "/api/trpc",
async headers() {
return {
cookie: typeof document !== "undefined" ? document.cookie : "",
}
},
}),
],
})
The collection hooks use tRPC for all mutations, providing full end-to-end type safety:
// In your collection configuration
onUpdate: async ({ transaction }) => {
const { modified: updatedTodo } = transaction.mutations[0]
const result = await trpc.todos.update.mutate({
id: updatedTodo.id,
data: {
text: updatedTodo.text,
completed: updatedTodo.completed,
},
})
return { txid: result.txid }
},
API Routes:
API Routes:
- `/api/trpc/*` - tRPC mutations with full type safety
- `/api/auth/*` - Authentication via better-auth
- `/api/projects`, `/api/todos`, `/api/users` - Electric sync shapes for reads

Follow these patterns to get the most out of this starter:
- Read data with `useLiveQuery` and collections, not tRPC queries
- Write data through collection operations like `collection.insert()`, not `trpc.create.mutate()` directly