# createAgentUIStreamResponse

The `createAgentUIStreamResponse` function executes an agent, runs its streaming output as a UI message stream, and returns an HTTP `Response` object whose body is the live, streaming UI message output. It is designed for API routes that deliver real-time agent results, such as chat endpoints or streaming tool-use operations.
<Snippet
  text={`import { createAgentUIStreamResponse } from "ai"`}
  prompt={false}
/>
```ts
import { ToolLoopAgent, createAgentUIStreamResponse } from 'ai';
__PROVIDER_IMPORT__;

const agent = new ToolLoopAgent({
  model: __MODEL__,
  instructions: 'You are a helpful assistant.',
  tools: { weather: weatherTool, calculator: calculatorTool },
});

export async function POST(request: Request) {
  const { messages } = await request.json();

  // Optional: support cancellation (aborts on disconnect, etc.)
  const abortController = new AbortController();

  return createAgentUIStreamResponse({
    agent,
    uiMessages: messages,
    abortSignal: abortController.signal, // optional
    // ...other UIMessageStreamOptions like sendSources, experimental_transform, etc.
  });
}
```
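The `AbortController` in the route above is created but never wired to anything. In practice you would typically relay the incoming request's own abort signal, so that a client disconnect cancels the agent stream. A minimal, framework-agnostic sketch (this helper is not part of the SDK) of relaying a Fetch API `request.signal` to the controller whose signal is passed as `abortSignal`:

```typescript
// Hypothetical helper: forward a Fetch API request's abort signal
// (which fires when the client disconnects) to a fresh AbortController.
function linkAbort(requestSignal: AbortSignal): AbortController {
  const controller = new AbortController();
  if (requestSignal.aborted) {
    // Already disconnected: abort immediately.
    controller.abort();
  } else {
    requestSignal.addEventListener('abort', () => controller.abort(), {
      once: true,
    });
  }
  return controller;
}
```

With this helper, passing `abortSignal: linkAbort(request.signal).signal` aborts the agent stream as soon as the client goes away.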
<PropertiesTable
  content={[
    {
      name: 'agent',
      type: 'Agent',
      isRequired: true,
      description:
        'The agent instance to stream responses from. Must implement .stream({ prompt, ... }) and define the tools property.',
    },
    {
      name: 'uiMessages',
      type: 'unknown[]',
      isRequired: true,
      description:
        'Array of input UI messages provided to the agent (e.g., user and assistant messages).',
    },
    {
      name: 'abortSignal',
      type: 'AbortSignal',
      isRequired: false,
      description:
        'Optional abort signal to cancel streaming, e.g., on client disconnect.',
    },
    {
      name: 'timeout',
      type: 'number | { totalMs?: number }',
      isRequired: false,
      description:
        'Timeout in milliseconds, specified as a number or as an object with a totalMs property. The call is aborted if it takes longer than the specified timeout. Can be used alongside abortSignal.',
    },
    {
      name: 'options',
      type: 'CALL_OPTIONS',
      isRequired: false,
      description:
        'Optional agent call options, for agents with the generic parameter CALL_OPTIONS.',
    },
    {
      name: 'experimental_transform',
      type: 'StreamTextTransform | StreamTextTransform[]',
      isRequired: false,
      description:
        'Optional stream transforms to post-process text output, the same as in lower-level streaming APIs.',
    },
    {
      name: 'onStepFinish',
      type: 'ToolLoopAgentOnStepFinishCallback',
      isRequired: false,
      description:
        'Callback invoked after each agent step (LLM/tool call) completes. Useful for tracking token usage or logging intermediate steps.',
    },
    {
      name: '...UIMessageStreamOptions',
      type: 'UIMessageStreamOptions',
      isRequired: false,
      description:
        'Other UI message stream options, such as sendSources.',
    },
    {
      name: 'headers',
      type: 'HeadersInit',
      isRequired: false,
      description: 'Optional HTTP headers to include in the Response object.',
    },
    {
      name: 'status',
      type: 'number',
      isRequired: false,
      description: 'Optional HTTP status code.',
    },
    {
      name: 'statusText',
      type: 'string',
      isRequired: false,
      description: 'Optional HTTP status text.',
    },
    {
      name: 'consumeSseStream',
      type: '(options: { stream: ReadableStream<string> }) => PromiseLike<void> | void',
      isRequired: false,
      description:
        'Optional function to consume the SSE stream. When provided, it is called with the SSE stream to handle consumption.',
    },
  ]}
/>
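`timeout` and `abortSignal` can be used together: whichever fires first cancels the call. Conceptually this amounts to merging the two signals, which the platform can express directly with standard Web APIs (a sketch of the idea, not the SDK's internals; requires Node 20+ for `AbortSignal.any`):

```typescript
// Sketch: merge a caller-provided signal with a timeout into one signal.
// AbortSignal.timeout and AbortSignal.any are standard Web APIs.
function mergeTimeout(signal: AbortSignal, totalMs: number): AbortSignal {
  return AbortSignal.any([signal, AbortSignal.timeout(totalMs)]);
}
```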
### Returns

A `Promise<Response>` whose body is a streaming UI message output from the agent. Use it as the return value of API/server handlers in serverless, Next.js, Express, Hono, or edge runtime contexts.
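The returned `Response` carries a server-sent-events body of UI message chunks. As an illustration only, a mock (not the SDK's actual implementation, and the chunk shape shown is an assumption) of what such a response looks like:

```typescript
// Mock only: build a Response whose body is an SSE stream of JSON chunks,
// terminated with a [DONE] marker, as a streaming UI endpoint would emit.
function mockUIStreamResponse(chunks: object[]): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        // Each chunk becomes one SSE event: "data: <json>\n\n".
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
      }
      controller.enqueue(encoder.encode('data: [DONE]\n\n'));
      controller.close();
    },
  });
  return new Response(body, {
    headers: { 'Content-Type': 'text/event-stream' },
  });
}
```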
```ts
import { createAgentUIStreamResponse } from 'ai';
import { MyCustomAgent } from '@/agent/my-custom-agent';

export async function POST(request: Request) {
  const { messages } = await request.json();

  return createAgentUIStreamResponse({
    agent: MyCustomAgent,
    uiMessages: messages,
    sendSources: true, // (optional)
    // headers, status, abortSignal, and other UIMessageStreamOptions also supported
  });
}
```
The function:

1. Processes the `uiMessages` array according to the agent's specified tools and requirements.
2. Calls the agent's `.stream({ prompt, ... })` to get a stream of chunks (steps/UI messages).
3. Returns a `Response` object that streams UI message chunks to the client.

Agents must implement `.stream({ prompt, ... })` and define a `tools` property (even if it's just `{}`) to work with this function. Additional options (`headers`, `status`, UI stream options, transforms, etc.) are available for advanced scenarios.
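The `consumeSseStream` option receives the raw SSE text as a `ReadableStream<string>`. A hypothetical consumer (the helper name and logging use case are assumptions, not part of the SDK) that drains the stream and collects each `data:` payload, e.g. for server-side logging:

```typescript
// Hypothetical consumeSseStream callback body: parse SSE events out of a
// ReadableStream<string> and collect the "data:" payloads.
async function collectSseData(stream: ReadableStream<string>): Promise<string[]> {
  const reader = stream.getReader();
  const payloads: string[] = [];
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += value;
    // SSE events are separated by a blank line ("\n\n").
    let idx: number;
    while ((idx = buffer.indexOf('\n\n')) !== -1) {
      const event = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 2);
      for (const line of event.split('\n')) {
        if (line.startsWith('data: ')) payloads.push(line.slice(6));
      }
    }
  }
  return payloads;
}
```

Note that when `consumeSseStream` is provided, the function consumes a copy of the stream server-side; the `Response` is still returned to the client as usual.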