client-sdks/ai-sdk/README.md
The recommended way to use Mastra with AI SDK is to install the `@mastra/ai-sdk` package, which provides custom API routes and utilities for streaming Mastra agents in AI SDK-compatible formats. It includes chat, workflow, and network route handlers, along with utilities and exported types for UI integration.
```bash
npm install @mastra/ai-sdk
```
To serve dynamic agents, use a path that includes the `:agentId` parameter:
```typescript
import { Mastra } from '@mastra/core/mastra';
import { chatRoute } from '@mastra/ai-sdk';

export const mastra = new Mastra({
  server: {
    apiRoutes: [
      chatRoute({
        path: '/chat/:agentId',
      }),
    ],
  },
});
```
Alternatively, create a fixed route (e.g. `/chat`) bound to a specific agent:
```typescript
import { Mastra } from '@mastra/core/mastra';
import { chatRoute } from '@mastra/ai-sdk';

export const mastra = new Mastra({
  server: {
    apiRoutes: [
      chatRoute({
        path: '/chat',
        agent: 'weatherAgent',
      }),
    ],
  },
});
```
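You can sanity-check the fixed route with a plain HTTP request. A sketch assuming the Mastra dev server on its default port 4111; the body shape mirrors the `{ messages: UIMessage[] }` payload that `useChat` sends:

```bash
curl -X POST http://localhost:4111/chat \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"id":"1","role":"user","parts":[{"type":"text","text":"What is the weather in Paris?"}]}]}'
```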
After defining a dynamic route with `:agentId`, you can use the `useChat()` hook like so:
```typescript
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport, type UIMessage } from 'ai';

// Narrow this to your own message type if you use custom metadata or data parts.
type MyMessage = UIMessage;

const { error, status, sendMessage, messages, regenerate, stop } = useChat<MyMessage>({
  transport: new DefaultChatTransport({
    api: 'http://localhost:4111/chat/weatherAgent',
  }),
});
```
`chatRoute()` forwards the incoming request's `AbortSignal` to `agent.stream()`. If the client disconnects, Mastra aborts the in-flight generation. If you need generation to continue and persist server-side after a disconnect, build a custom route around `agent.stream()`, avoid passing the request signal, and call `consumeStream()` on the returned `MastraModelOutput`.
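A minimal sketch of such a custom route, assuming the `weatherAgent` from earlier and Mastra's `registerApiRoute` helper; the route path and response shape are illustrative:

```typescript
import { registerApiRoute } from '@mastra/core/server';

export const detachedChatRoute = registerApiRoute('/chat/detached', {
  method: 'POST',
  handler: async (c) => {
    const mastra = c.get('mastra');
    const { messages } = await c.req.json();

    const agent = mastra.getAgent('weatherAgent');
    // No abortSignal is passed, so a client disconnect
    // does not cancel the generation.
    const stream = await agent.stream(messages);

    // Drain the stream server-side so the full result is produced
    // and persisted even if nobody reads the response.
    void stream.consumeStream();

    return c.json({ started: true });
  },
});
```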
Use `workflowRoute()` to stream a workflow in an AI SDK-compatible format:
```typescript
import { Mastra } from '@mastra/core/mastra';
import { workflowRoute } from '@mastra/ai-sdk';

export const mastra = new Mastra({
  server: {
    apiRoutes: [
      workflowRoute({
        path: '/workflow',
        workflow: 'weatherWorkflow',
      }),
    ],
  },
});
```
Use `networkRoute()` to stream agent networks (routing plus nested agent, workflow, and tool executions) in an AI SDK-compatible format:
```typescript
import { Mastra } from '@mastra/core/mastra';
import { networkRoute } from '@mastra/ai-sdk';

export const mastra = new Mastra({
  server: {
    apiRoutes: [
      networkRoute({
        path: '/network',
        agent: 'routingAgent',
      }),
    ],
  },
});
```
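On the client, a network route can be consumed the same way as a chat route. A sketch assuming the Mastra server on its default port 4111:

```typescript
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

// Point the transport at the network route defined above.
const { messages, sendMessage, status } = useChat({
  transport: new DefaultChatTransport({
    api: 'http://localhost:4111/network',
  }),
});
```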
For use outside the Mastra server (e.g. Next.js App Router, Express), you can call the standalone handler functions directly. These handlers return a compatibility `ReadableStream` that can be passed to AI SDK response helpers such as `createUIMessageStreamResponse` and `pipeUIMessageStreamToResponse`:
```typescript
import { handleChatStream } from '@mastra/ai-sdk';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  const params = await req.json();
  const stream = await handleChatStream({
    mastra,
    agentId: 'weatherAgent',
    params,
  });
  return createUIMessageStreamResponse({ stream });
}
```
Workflows work the same way via `handleWorkflowStream()`:

```typescript
import { handleWorkflowStream } from '@mastra/ai-sdk';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  const params = await req.json();
  const stream = await handleWorkflowStream({
    mastra,
    workflowId: 'myWorkflow',
    params,
  });
  return createUIMessageStreamResponse({ stream });
}
```
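For Node-style response objects, the same stream can be piped with `pipeUIMessageStreamToResponse`. A sketch assuming an Express app and the `weatherAgent` from above; the port and paths are illustrative:

```typescript
import express from 'express';
import { pipeUIMessageStreamToResponse } from 'ai';
import { handleChatStream } from '@mastra/ai-sdk';
import { mastra } from './src/mastra';

const app = express();
app.use(express.json());

app.post('/chat', async (req, res) => {
  const stream = await handleChatStream({
    mastra,
    agentId: 'weatherAgent',
    params: req.body,
  });
  // Pipes UI message chunks into the Node ServerResponse.
  pipeUIMessageStreamToResponse({ response: res, stream });
});

app.listen(3000);
```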
Pass AI SDK `UIMessage[]` values typed against your installed `ai` version so TypeScript can infer the correct stream overload. Handlers default to the existing v5 behavior; if your app is typed against `ai@6`, pass `version: 'v6'`:
```typescript
import { handleNetworkStream } from '@mastra/ai-sdk';
import { createUIMessageStreamResponse, type UIMessage } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  const params = (await req.json()) as { messages: UIMessage[] };
  const stream = await handleNetworkStream({
    mastra,
    agentId: 'routingAgent',
    version: 'v6',
    params,
  });
  return createUIMessageStreamResponse({ stream });
}
```
If you have a raw Mastra stream, you can manually transform it into AI SDK UI message parts with `toAISdkStream`, which works for both versions. If your app is typed against `ai@6`, pass `version: 'v6'`.
```typescript
import { toAISdkStream } from '@mastra/ai-sdk';
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const agent = mastra.getAgent('weatherAgent');
  const stream = await agent.stream(messages);

  // Deduplicate messages: https://ai-sdk.dev/docs/troubleshooting/repeated-assistant-messages
  const uiMessageStream = createUIMessageStream({
    originalMessages: messages,
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, { from: 'agent' })) {
        writer.write(part);
      }
    },
  });

  return createUIMessageStreamResponse({ stream: uiMessageStream });
}
```
For AI SDK v6, select the v6 stream contract explicitly:
```typescript
const uiMessageStream = createUIMessageStream({
  originalMessages: messages,
  execute: async ({ writer }) => {
    for await (const part of toAISdkStream(stream, {
      from: 'agent',
      version: 'v6',
    })) {
      writer.write(part);
    }
  },
});
```