One of the exciting things about local-first software is the potential to eliminate APIs and microservices. Instead of coding across the network, you code against a local store, data syncs in the background and your stack is suddenly much simpler.
But what if you don't want to eliminate your API? What if you want, or need, to keep it? How do you develop local-first software then?
With Electric, you can develop local-first apps incrementally, using your existing API.
I gave a talk on this subject at the second Local-first meetup in Berlin in December 2024:
<div class="embed-container"> <YoutubeEmbed video-id="gSGEFYuLuho" /> </div>

There's a great book by Harvey Molotch called *Where Stuff Comes From* which talks about how nothing exists in isolation. One of his examples is a toaster.
<figure style="max-width: 512px;"> <div style="position:relative;height:0;padding-bottom:56.25%"> <iframe src="https://embed.ted.com/talks/thomas_thwaites_how_i_built_a_toaster_from_scratch?subtitle=en" width="512px" height="288px" title="How I built a toaster -- from scratch" style="position:absolute;left:0;top:0;width:100%;height:100%" frameborder="0" scrolling="no" allowfullscreen> </iframe> </div> </figure>

At first glance, a toaster seems like a pretty straightforward, standalone product. However, look a bit closer and it integrates with a huge number of other things. Like sliced bread and all the supply chain behind it. It runs on electricity. Through a standard plug. It sits on a worktop. Even the spring in the lever that you press down to start the toast is calibrated to match the strength of your arm.
Your API is a toaster. It doesn't exist in isolation. It's tied into other systems, like your monitoring systems and the way you do migrations and deployment. It's hard to just rip it out, because then you break these integrations and ergonomics — and obviate your own tooling and operational experience.
For example, REST APIs are stateless. We know how to scale them. We know how to debug them. They show up in the browser console. Swapping them out is all very well in theory, but what happens when your new system goes down in production?
At Electric, our mission is to make sync and local-first adoptable for mainstream software. So, one of the main challenges we've focused on is how to use Electric with your existing software stack.
This is why we work with any data model in any standard Postgres. It's why we allow you to sync data into anything from a JavaScript object to a local database. And it's why we focus on providing composable primitives rather than a one-size-fits-all solution.
As a result, with Electric, you can develop local-first apps incrementally, using your existing API. So you can get the benefits of local-first, without having to re-engineer your stack or re-invent sliced bread, just to make toast in the morning.
First use Electric to sync data into your app. This allows your app to work with local data without it getting stale.
Then use your API to handle:

- authorization
- writes

As well as, optionally, other concerns like:

- encryption
- filtering
Because Electric syncs data over HTTP, you can use existing middleware, integrations and instrumentation. Like authorization services and the browser console.
To build local-first you have to have the data locally. If you're doing that with data fetching then you have a stale data problem. Because if you're working with local data without keeping it in sync, then how do you know that it's not stale?
<figure style="max-width: 512px"> <a :href="NoStaleDataJPG"> </a> </figure>

This is why you need data sync. To keep the local data fresh when it changes.
Happily, this is exactly what Electric does. It syncs data into local apps and services and keeps it fresh for you. Practically, what does this look like? Well, instead of fetching data using web service calls, i.e. something like this:
```tsx
import React, { useState, useEffect } from 'react'

const MyComponent = () => {
  const [items, setItems] = useState([])

  useEffect(() => {
    const fetchItems = async () => {
      const response = await fetch('https://example.com/v1/api/items')
      const data = await response.json()

      setItems(data)
    }

    fetchItems()
  }, [])

  return <List items={items} />
}
```
Sync data using Electric, like this:
```tsx
import { useShape } from '@electric-sql/react'

const MyComponent = () => {
  const { data } = useShape({
    url: `https://electric.example.com/v1/shape`,
    params: {
      table: 'items',
    },
  })

  return <List items={data} />
}
```
You can go much further with Electric, all the way to syncing into a local database. But you can do this incrementally as and when you need to.
Electric only does the read-path sync. It syncs data out of Postgres into local apps.
Electric does not do write-path sync. It does not provide (or prescribe) a solution for getting data back into Postgres from local apps and services. In fact, it's explicitly designed for you to handle writes yourself.
The other key thing about Electric sync is that it's just JSON over HTTP.
Because it's JSON you can parse it and work with it in any language and environment. Because it's HTTP you can proxy it. Which means you can use existing HTTP services and middleware to authorize access to it.
In fact, whatever you want to do to the replication stream — encrypt, filter, transform, split, remix, buffer, you name it — you can do through a proxy. Extensibility is built in at the protocol layer.
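As a concrete illustration, here's a minimal sketch of one thing such a proxy might do: rewrite an incoming shape request so the `where` clause is always set server-side, meaning clients can only ever sync their own rows. The `ELECTRIC_URL` value and the `user_id` column are assumptions for the example; the `userId` would come from credentials you've already verified.

```typescript
// Illustrative sketch, not the official proxy implementation.
// Assumes Electric is reachable at ELECTRIC_URL and that rows carry
// a `user_id` column; `userId` comes from an already-verified token.
const ELECTRIC_URL = 'http://localhost:3000'

// Build the upstream URL to proxy to Electric, forcing a server-side
// filter so clients can only sync their own rows.
function upstreamShapeUrl(incoming: string, userId: string): string {
  const url = new URL(incoming)
  const upstream = new URL(`${ELECTRIC_URL}/v1/shape`)

  // Copy shape and pagination params through unchanged...
  url.searchParams.forEach((value, key) => {
    if (key !== 'where') {
      upstream.searchParams.set(key, value)
    }
  })

  // ...but overwrite any client-supplied filter with one we control.
  upstream.searchParams.set('where', `user_id = '${userId}'`)

  return upstream.toString()
}
```

A real proxy would then `fetch` this upstream URL and stream the response body back to the client.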
So far, we've seen that Electric handles read-path sync and leaves writes up to you. We've seen how it syncs over HTTP and how this allows you to implement auth and other concerns like encryption and filtering using proxies.
Now, let's dive into these aspects and see exactly how to implement them using your existing API. With code samples and links to example apps.
Web-service based apps typically authorize access to resources in a controller or middleware layer. When switching to use a sync engine without an API, you cut out these layers and typically need to codify your auth logic as database rules.
For example in Firebase you have Security Rules that look like this:
```
service <<name>> {
  // Match the resource path.
  match <<path>> {
    // Allow the request if the following conditions are true.
    allow <<methods>> : if <<condition>>
  }
}
```
In Postgres-based systems like Supabase Realtime, you use Postgres Row Level Security (RLS) rules, e.g.:
```sql
create policy "Individuals can view their own todos."
  on todos for select
  using ( (select auth.uid()) = user_id );
```
With Electric, you don't need to do this. Electric syncs over HTTP. You make HTTP requests to a Shape endpoint (see <a href="/openapi.html#/paths/~1v1~1shape/get" target="_blank">spec here</a>) at:
```http
GET /v1/shape
```
Because this is an HTTP resource, you can authorize access to it just as you would any other web service resource: using HTTP middleware. Route the request to Electric through an authorizing proxy that you control:
<a :href="AuthorizingProxyJPG"> </a>

You can see this pattern implemented in the Proxy auth example.
This defines a proxy that takes an HTTP request, reads the user credentials from an `Authorization` header, uses them to authorize the request and, if successful, proxies the request on to Electric:
<<< @../../examples/proxy-auth/app/shape-proxy/route.ts{typescript}
You can run this kind of proxy as part of your existing backend API. Here's another example, this time using a Plug to authorize requests to a Phoenix application:
<<< @../../examples/gatekeeper-auth/api/lib/api_web/plugs/auth/verify_token.ex{elixir}
If you're running Electric behind a CDN, you're likely to want to deploy your authorizing proxy in front of the CDN. Otherwise routing requests through your API adds latency and can become a bottleneck. You can achieve this by deploying your proxy as an edge function or worker in front of the CDN, for example using Cloudflare Workers or Supabase Edge Functions.
Here's a Supabase edge function using Deno that verifies that the shape definition in a JWT matches the shape definition in the request params:
<<< @../../examples/gatekeeper-auth/edge/index.ts{typescript}
You can also use external authorization services in your proxy.
For example, Authzed is a low-latency, distributed authorization service based on Google Zanzibar. You can use it in an edge proxy to authorize requests in front of a CDN, whilst still ensuring strong consistency for your authorization logic.
```typescript
import jwt from 'jsonwebtoken'
import { v1 } from '@authzed/authzed-node'

const AUTH_SECRET =
  Deno.env.get('AUTH_SECRET') || 'NFL5*0Bc#9U6E@tnmC&E7SUN6GwHfLmY'
const ELECTRIC_URL = Deno.env.get('ELECTRIC_URL') || 'http://localhost:3000'

const HAS_PERMISSION = v1.CheckPermissionResponse_Permissionship.HAS_PERMISSION

function verifyAuthHeader(headers: Headers) {
  const auth_header = headers.get('Authorization')

  if (auth_header === null) {
    return [false, null]
  }

  const token = auth_header.split('Bearer ')[1]

  try {
    const claims = jwt.verify(token, AUTH_SECRET, { algorithms: ['HS256'] })

    return [true, claims]
  } catch (err) {
    console.warn(err)

    return [false, null]
  }
}

Deno.serve(async (req) => {
  const url = new URL(req.url)

  const [isValidJWT, claims] = verifyAuthHeader(req.headers)
  if (!isValidJWT) {
    return new Response('Unauthorized', { status: 401 })
  }

  // See https://github.com/authzed/authzed-node and
  // https://authzed.com/docs/spicedb/getting-started/discovering-spicedb
  const client = v1.NewClient(claims.token)

  const resource = v1.ObjectReference.create({
    objectType: `example/table`,
    objectId: claims.table,
  })
  const user = v1.ObjectReference.create({
    objectType: 'example/user',
    objectId: claims.user_id,
  })
  const subject = v1.SubjectReference.create({
    object: user,
  })

  const permissionRequest = v1.CheckPermissionRequest.create({
    permission: 'read',
    resource,
    subject,
  })

  const checkResult = await new Promise((resolve, reject) => {
    client.checkPermission(permissionRequest, (err, response) =>
      err ? reject(err) : resolve(response)
    )
  })

  if (checkResult.permissionship !== HAS_PERMISSION) {
    return new Response('Forbidden', { status: 403 })
  }

  return fetch(`${ELECTRIC_URL}/v1/shape${url.search}`, {
    headers: req.headers,
  })
})
```
Another pattern, illustrated in our gatekeeper-auth example, is to:

1. use a gatekeeper endpoint in your API to authorize the user and generate a shape-scoped auth token
2. validate that token in a proxy in front of Electric, for each shape request
This allows you to keep more of your auth logic in your API and minimise what's executed on the "hot path" of the proxy. This is actually what the code example shown in the edge proxy section above does, using an edge worker to validate a shape-scoped auth token.
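To make the split concrete, here's a sketch of the two halves of the pattern using a hand-rolled HMAC token. The names (`issueShapeToken`, `verifyShapeToken`) and the token format are illustrative only; the actual example uses JWTs, and a real app should too.

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Illustrative sketch of the gatekeeper pattern. A real app would use
// a JWT library; the names and token format here are made up.
const SECRET = 'change-me'

// Gatekeeper endpoint logic: after running your full auth checks,
// issue a token scoped to exactly one shape definition.
function issueShapeToken(shape: { table: string; where?: string }): string {
  const payload = Buffer.from(JSON.stringify(shape)).toString('base64url')
  const sig = createHmac('sha256', SECRET).update(payload).digest('base64url')
  return `${payload}.${sig}`
}

// Proxy logic: a cheap "hot path" check that the requested shape
// matches the one the token was issued for.
function verifyShapeToken(
  token: string,
  requested: { table: string; where?: string }
): boolean {
  const [payload, sig] = token.split('.')
  if (!payload || !sig) return false

  const expected = createHmac('sha256', SECRET).update(payload).digest('base64url')
  if (sig.length !== expected.length) return false
  if (!timingSafeEqual(Buffer.from(sig), Buffer.from(expected))) return false

  const scoped = JSON.parse(Buffer.from(payload, 'base64url').toString())
  return scoped.table === requested.table && scoped.where === requested.where
}
```

The expensive authorization logic runs once, in the gatekeeper; the proxy only ever does a signature check and a comparison.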
You can also achieve the same thing using a standard reverse proxy like Caddy, Nginx or Varnish. For example, using Caddy:
<<< @../../examples/gatekeeper-auth/caddy/Caddyfile{hcl}
The workflow from the client's point of view is to first hit the gatekeeper endpoint to generate a shape-scoped auth token, e.g.:
```shell
$ curl -sX POST "http://localhost:4000/gatekeeper/items" | jq
{
  "headers": {
    "Authorization": "Bearer <token>"
  },
  "url": "http://localhost:4000/proxy/v1/shape",
  "table": "items"
}
```
Then use the token to authorize requests to Electric, via the proxy, e.g.:
```shell
$ curl -sv --header "Authorization: Bearer <token>" \
    "http://localhost:4000/proxy/v1/shape?table=items&offset=-1"
...
< HTTP/1.1 200 OK
...
```
The TypeScript client supports auth headers and 401/403 error handling, so you can wrap this up, e.g.:
<<< @../../examples/gatekeeper-auth/client/index.ts{ts}
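The retry logic itself is plain fetch wrapping, so here's a self-contained sketch of that part. `getToken` and `refreshToken` are assumed helpers you'd implement against your gatekeeper endpoint; depending on your client version, you'd plug the resulting function in as a custom fetch implementation.

```typescript
// Illustrative sketch: attach the shape-scoped token to each request
// and refresh it once if the proxy rejects it with a 401 or 403.
type FetchLike = (input: string | URL, init?: RequestInit) => Promise<Response>

type TokenSource = {
  getToken: () => Promise<string>      // assumed helper: current token
  refreshToken: () => Promise<string>  // assumed helper: hit the gatekeeper again
}

function createAuthFetch(tokens: TokenSource, baseFetch: FetchLike = fetch): FetchLike {
  return async (input, init = {}) => {
    const doFetch = async (token: string) =>
      baseFetch(input, {
        ...init,
        headers: { ...init.headers, Authorization: `Bearer ${token}` },
      })

    let response = await doFetch(await tokens.getToken())

    if (response.status === 401 || response.status === 403) {
      // Token expired or shape mismatch: get a fresh one and retry once.
      response = await doFetch(await tokens.refreshToken())
    }

    return response
  }
}
```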
Electric does read-path sync. That's the bit between Postgres and the client in the diagram below. Electric does not handle writes. That's the dashed blue arrows around the outside, back from the client into Postgres:
<figure> <a href="/img/api/shape-log.jpg"> </a> </figure>

Instead, Electric is designed for you to implement writes yourself. There's a comprehensive Writes guide and a Write patterns example that walk through a range of approaches, all of which integrate with your existing API.
You can also see a number of examples that use an API for writes, including the Linearlite, Phoenix LiveView and Tanstack examples.
To highlight a couple of the key patterns, let's look at the shared API server for the write-patterns example. It is an Express app that exposes the write methods of a REST API for a table of todos:
- `POST {todo} /todos` to create a todo
- `PUT {partial-todo} /todos/:id` to update a todo
- `DELETE /todos/:id` to delete a todo

<<< @../../examples/write-patterns/shared/backend/api.js{js}
If you then look at the optimistic state pattern (one of the approaches illustrated in the write-patterns example) you can see this being used, together with Electric sync, to support instant, local, offline-capable writes:
<<< @../../examples/write-patterns/patterns/2-optimistic-state/index.tsx{tsx}
You can also see the shared persistent optimistic state pattern for a more resilient, comprehensive approach to building local-first apps with Electric on optimistic state.
Another pattern covered in the Writes guide is through-the-database sync. This approach uses Electric to sync into a local, embedded database and then syncs changes made to the local database back to Postgres, via your API.
The example implementation uses Electric to sync into PGlite as the local embedded database. All the application code needs to do is read and write to the local database. The database schema takes care of everything else, including keeping a log of local changes to send to the server.
This is then processed by a sync utility that sends data to a `POST {transactions} /changes` endpoint, implemented in the shared API server shown above:
<<< @../../examples/write-patterns/patterns/4-through-the-db/sync.ts{ts}
Just as with reads, because you're sending writes to an API endpoint, you can use your API, middleware or a proxy to authorize them, just as you would any other API request.
Again, to emphasise, this allows you to develop local-first apps, without having to codify write-path authorization logic into database rules. In fact, in many cases, you can just keep your existing API endpoints and you may not need to change any code at all.
Electric syncs ciphertext as well as it syncs plaintext. You can encrypt and decrypt data on and off the local client, i.e.:

- encrypt data before it's sent up to the server
- decrypt data when it syncs in, before handing it to your app code
You can see an example of this in the encryption example:
<<< @../../examples/encryption/src/Example.tsx{tsx}
One of the challenges with encryption is key management. I.e.: choosing which data to encrypt with which keys and sharing the right keys with the right users.
There are some good patterns here, like using a key per resource, such as a tenant, workspace or group. You can then encrypt data within that resource using a specific key and share the key with users when they get access to the resource (e.g. when added to the group).
Electric is good at syncing keys. For example, you could define a shape like:
```ts
const stream = new ShapeStream({
  url: `${ELECTRIC_URL}/v1/shape`,
  params: {
    table: 'tenants',
    columns: ['keys'],
    where: `id in ('${user.tenant_ids.join(`', '`)}')`,
  },
})
```
You could sync this either in your client or in your proxy. You could then put a denormalised `tenant_id` column on all of your rows and look up the correct key to use when decrypting and encrypting each row.
The HTTP API streams a log of change operations. You can intercept this at any level -- in your API, in a middleware proxy or when handling or materialising the log from a ShapeStream instance in the client.
Because Electric syncs over HTTP, it integrates with standard debugging, visibility and monitoring tools.
You can see Electric requests in your standard HTTP logs. You can catch errors and send them with request-specific context to systems like Sentry and AppSignal.
You can debug on the command line using curl.
One of the most important aspects of this is being able to see and easily introspect sync requests in the browser console. This lets you see what data is being sent through and when, and lets you observe caching and offline behaviour.
<p style="max-width: 512px"> <a :href="BrowserConsolePNG"> </a> </p>

You don't need to implement custom tooling to get visibility into what's happening with Electric. It's not a black box when it comes to debugging, in development or in production.
This post has outlined how you can develop local-first software incrementally, using your existing API alongside Electric for read-path sync.
To learn more and get started with Electric, see the Quickstart, Documentation and source code on GitHub:
<div class="actions cta-actions page-footer-actions left"> <div class="action"> <VPButton href="/docs/quickstart" text="Quickstart" theme="electric" /> </div> <div class="action"> <GitHubButton repo="electric-sql/electric" text="Star on GitHub" /> </div> </div>