You can use the AI SDK in a Node.js HTTP server to generate text and stream it to the client.
The examples start a simple HTTP server that listens on port 8080. You can test it with curl, for example:

```sh
curl -X POST http://localhost:8080
```

Full example: [github.com/vercel/ai/examples/node-http-server](https://github.com/vercel/ai/examples/node-http-server)
You can use the `pipeUIMessageStreamToResponse` method to pipe the stream data to the server response:
```ts
import { streamText } from 'ai';
import { createServer } from 'http';

createServer(async (req, res) => {
  const result = streamText({
    model: 'openai/gpt-4o',
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  result.pipeUIMessageStreamToResponse(res);
}).listen(8080);
```
`createUIMessageStream` and `pipeUIMessageStreamToResponse` can be used to send custom data to the client:
```ts
import {
  createUIMessageStream,
  pipeUIMessageStreamToResponse,
  streamText,
} from 'ai';
import { createServer } from 'http';

createServer(async (req, res) => {
  switch (req.url) {
    case '/stream-data': {
      const stream = createUIMessageStream({
        execute: ({ writer }) => {
          // write some custom data
          writer.write({ type: 'start' });
          writer.write({
            type: 'data-custom',
            data: {
              custom: 'Hello, world!',
            },
          });

          const result = streamText({
            model: 'openai/gpt-4o',
            prompt: 'Invent a new holiday and describe its traditions.',
          });

          writer.merge(
            result.toUIMessageStream({
              // the 'start' chunk was already written manually above
              sendStart: false,
              onError: error => {
                // Error messages are masked by default for security reasons.
                // If you want to expose the error message to the client, you can do so here:
                return error instanceof Error ? error.message : String(error);
              },
            }),
          );
        },
      });

      pipeUIMessageStreamToResponse({ stream, response: res });
      break;
    }
  }
}).listen(8080);
```
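The UI message stream is delivered to the client as server-sent events, where each event's `data` field carries one JSON chunk. As a rough sketch of what a client has to parse (the sample payload and chunk fields below are illustrative assumptions, not output captured from a real response), the chunks can be extracted like this:

```typescript
// Illustrative SSE body resembling what the /stream-data route might send.
// The exact chunk types and fields are assumptions; consult the UI message
// stream protocol documentation for the authoritative wire format.
const sampleBody = [
  'data: {"type":"start"}',
  '',
  'data: {"type":"data-custom","data":{"custom":"Hello, world!"}}',
  '',
  'data: {"type":"text-delta","id":"1","delta":"Happy"}',
  '',
  'data: [DONE]',
  '',
].join('\n');

// Parse each `data:` line into a chunk object, stopping at the [DONE] marker.
function parseChunks(body: string): Array<Record<string, unknown>> {
  const chunks: Array<Record<string, unknown>> = [];
  for (const line of body.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice('data: '.length);
    if (payload === '[DONE]') break;
    chunks.push(JSON.parse(payload));
  }
  return chunks;
}

console.log(parseChunks(sampleBody).map(chunk => chunk.type));
// → [ 'start', 'data-custom', 'text-delta' ]
```

In practice a client framework handles this parsing for you; the sketch only shows why the custom `data-custom` chunk written by the server arrives alongside the model's text chunks in the same stream.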
You can send a text stream to the client using `pipeTextStreamToResponse`:
```ts
import { streamText } from 'ai';
import { createServer } from 'http';

createServer(async (req, res) => {
  const result = streamText({
    model: 'openai/gpt-4o',
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  result.pipeTextStreamToResponse(res);
}).listen(8080);
```