# httpBatchStreamLink
`httpBatchStreamLink` is a terminating link that batches an array of individual tRPC operations into a single HTTP request sent to a single tRPC endpoint (just like `httpBatchLink`), but instead of waiting for the whole batch to finish, it streams each response as soon as its data is available.
Options are identical to `httpBatchLink` options, with the following addition:

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `streamHeader` | `'trpc-accept' \| 'accept'` | `'trpc-accept'` | Which header to use to signal to the server that the client wants a streaming response. `'accept'` uses the standard `Accept` header instead of the custom `trpc-accept` header, which can avoid a CORS preflight for cross-origin streaming queries since `Accept` is a CORS-safelisted header. |
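For example, to opt into the standard `Accept` header for a cross-origin setup (a sketch; the URL and router types are illustrative):

```ts
import { createTRPCClient, httpBatchStreamLink } from '@trpc/client';
import type { AppRouter } from './server';

const client = createTRPCClient<AppRouter>({
  links: [
    httpBatchStreamLink({
      url: 'http://localhost:3000',
      // Send `Accept: application/jsonl` instead of the custom
      // `trpc-accept` header, avoiding a CORS preflight.
      streamHeader: 'accept',
    }),
  ],
});
```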
Usage is otherwise identical to `httpBatchLink`.
:::note
If you require the ability to change or set response headers (which includes cookies) from within your procedures, make sure to use `httpBatchLink` instead! This is because `httpBatchStreamLink` cannot set headers once the stream has begun.
:::
You can import and add the `httpBatchStreamLink` to the `links` array as such:

```ts twoslash
// @filename: server.ts
import { initTRPC } from '@trpc/server';

const t = initTRPC.create();
export const appRouter = t.router({});
export type AppRouter = typeof appRouter;

// @filename: client.ts
// ---cut---
import { createTRPCClient, httpBatchStreamLink } from '@trpc/client';
import type { AppRouter } from './server';

const client = createTRPCClient<AppRouter>({
  links: [
    httpBatchStreamLink({
      url: 'http://localhost:3000',
    }),
  ],
});
```
After that, you can make use of batching by wrapping your procedure calls in a `Promise.all`. The code below will produce exactly one HTTP request and, on the server, exactly one database query:
```ts twoslash
// @target: esnext
// @filename: server.ts
import { initTRPC } from '@trpc/server';
import { z } from 'zod';

const t = initTRPC.create();
export const appRouter = t.router({
  post: t.router({
    byId: t.procedure
      .input(z.number())
      .query(({ input }) => ({ id: input, title: `Post ${input}` })),
  }),
});
export type AppRouter = typeof appRouter;

// @filename: client.ts
import { createTRPCClient, httpBatchStreamLink } from '@trpc/client';
import type { AppRouter } from './server';

const trpc = createTRPCClient<AppRouter>({
  links: [httpBatchStreamLink({ url: 'http://localhost:3000' })],
});
// ---cut---
const somePosts = await Promise.all([
  trpc.post.byId.query(1),
  trpc.post.byId.query(2),
  trpc.post.byId.query(3),
]);
```
When batching requests together, the behavior of a regular `httpBatchLink` is to wait for all requests to finish before sending the response. If you want to send responses as soon as they are ready, you can use `httpBatchStreamLink` instead. This is useful for long-running requests.
```ts twoslash
// @filename: server.ts
import { initTRPC } from '@trpc/server';

const t = initTRPC.create();
export const appRouter = t.router({});
export type AppRouter = typeof appRouter;

// @filename: client.ts
// ---cut---
import { createTRPCClient, httpBatchStreamLink } from '@trpc/client';
import type { AppRouter } from './server';

const client = createTRPCClient<AppRouter>({
  links: [
    httpBatchStreamLink({
      url: 'http://localhost:3000',
    }),
  ],
});
```
Compared to a regular `httpBatchLink`, a `httpBatchStreamLink` will:

- Cause the requests to be sent with a `trpc-accept: application/jsonl` header (or `Accept: application/jsonl` when using `streamHeader: 'accept'`)
- Cause the response to be sent with `transfer-encoding: chunked` and `content-type: application/jsonl`
- Remove the `data` key from the argument object passed to `responseMeta` (because with a streamed response, the headers are sent before the data is available)

You can try this out on the homepage of tRPC.io: https://trpc.io/?try=minimal#try-it-out
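The `application/jsonl` format is newline-delimited JSON, which is what makes early flushing possible: each line is a complete JSON value that can be parsed as soon as it arrives, independent of the rest of the batch. A minimal sketch of that consumption pattern (not tRPC's actual parser; all names here are illustrative):

```ts
// Parse a streamed JSONL body line by line, yielding each JSON value
// as soon as its terminating newline arrives. Chunk boundaries need
// not align with line boundaries, so we buffer partial lines.
async function* parseJsonLines(stream: AsyncIterable<string>) {
  let buffer = '';
  for await (const chunk of stream) {
    buffer += chunk;
    let newline: number;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) yield JSON.parse(line);
    }
  }
}

// Simulated chunked response body: the second value is split across chunks.
async function* chunks() {
  yield '{"id":1}\n{"id';
  yield '":2}\n';
}

async function main() {
  for await (const value of parseJsonLines(chunks())) {
    console.log(value); // logs { id: 1 }, then { id: 2 }
  }
}
main();
```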
```ts twoslash
// @target: esnext
// @filename: trpc.ts
import { initTRPC } from '@trpc/server';

const t = initTRPC.create({});

export const router = t.router;
export const publicProcedure = t.procedure;

// @filename: server.ts
// ---cut---
import { publicProcedure, router } from './trpc';

const appRouter = router({
  examples: {
    iterable: publicProcedure.query(async function* () {
      for (let i = 0; i < 3; i++) {
        await new Promise((resolve) => setTimeout(resolve, 500));
        yield i;
      }
    }),
  },
});

export type AppRouter = typeof appRouter;

// @filename: client.ts
// ---cut---
import { createTRPCClient, httpBatchStreamLink } from '@trpc/client';
import type { AppRouter } from './server';

const trpc = createTRPCClient<AppRouter>({
  links: [
    httpBatchStreamLink({
      url: 'http://localhost:3000',
    }),
  ],
});

const iterable = await trpc.examples.iterable.query();
//    ^?

for await (const value of iterable) {
  console.log('Iterable:', value);
  //          ^?
}
```
Browser support should be identical to `fetch` support.

For runtimes other than browsers, the `fetch` implementation must support streaming: the response obtained by `await fetch(...)` should have a `body` property of type `ReadableStream<Uint8Array> | NodeJS.ReadableStream`, meaning that either:

- `response.body.getReader` is a function that returns a `ReadableStreamDefaultReader<Uint8Array>` object, or
- `response.body` is a `Uint8Array` `Buffer`

This includes support for `undici`, `node-fetch`, the native Node.js fetch implementation, and the WebAPI fetch implementation (browsers).
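As a quick sanity check for a custom runtime, you can probe whether a `Response` body exposes the reader interface described above (a sketch; `supportsStreaming` is an illustrative helper, not a tRPC export):

```ts
// Illustrative helper (not part of tRPC): checks whether a fetch
// Response exposes a streamable body, i.e. body.getReader() exists.
function supportsStreaming(response: { body?: unknown }): boolean {
  const body = response.body as { getReader?: unknown } | null | undefined;
  return typeof body?.getReader === 'function';
}

// The WHATWG Response available in browsers and modern Node.js qualifies:
const res = new Response('hello');
console.log(supportsStreaming(res)); // true
```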
Receiving the stream relies on the `TextDecoder` and `TextDecoderStream` APIs, which are not available in React Native. If you want to enable streaming there, you need to polyfill them — and note that if your `TextDecoderStream` polyfill does not automatically polyfill `ReadableStream` and `WritableStream`, those will need to be polyfilled as well.
You will also need to override the default `fetch` implementation in the `httpBatchStreamLink` configuration options with one that supports text streaming. In the example below, the `fetch` in scope is assumed to be a polyfill that understands the `reactNative: { textStreaming: true }` option (such as `react-native-fetch-api`):
```ts
import { httpBatchStreamLink } from '@trpc/client';

httpBatchStreamLink({
  fetch: (url, opts) =>
    fetch(url, {
      ...opts,
      reactNative: { textStreaming: true },
    }),
  url: 'http://localhost:3000',
});
```
:::caution AWS Lambda
`httpBatchStreamLink` is only supported on AWS Lambda when your infrastructure is set up for streaming responses. If it is not, this link will simply behave like a regular `httpBatchLink`.
:::
:::caution Cloudflare Workers
You need to enable the `ReadableStream` API through a feature flag: `streams_enable_constructors`.
:::
You can check out the source code for this link on GitHub.
When setting up your root config, you can pass in a `jsonl` option to configure a ping interval that keeps the connection alive:
```ts
import { initTRPC } from '@trpc/server';

const t = initTRPC.create({
  jsonl: {
    pingMs: 1000,
  },
});
```