import Steps from "@site/src/components/Steps";
import StepItem from "@site/src/components/StepItem";
Use `@mastra/deployer-vercel` to deploy your Mastra server as serverless functions on Vercel. The deployer bundles your code and generates a `.vercel/output` directory conforming to Vercel's Build Output API, ready to deploy with no additional configuration.
:::info
This guide covers deploying the Mastra server. If you're using a server adapter or web framework, deploy the way you normally would for that framework.
:::
You'll need a Mastra application and a Vercel account.
:::warning
Vercel Functions use an ephemeral filesystem, so any storage you configure (including observability storage) must be hosted externally. If you're using `LibSQLStore` with a file URL, switch to a remotely hosted database.
:::
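As a sketch, storage can point at a remotely hosted libSQL database instead of a local file (this assumes the `@mastra/libsql` package and a Turso-hosted database; the URL and environment variable name below are placeholders):

```typescript
import { Mastra } from '@mastra/core'
import { LibSQLStore } from '@mastra/libsql'

export const mastra = new Mastra({
  storage: new LibSQLStore({
    // Remote database instead of a file: URL, which would not survive
    // Vercel's ephemeral filesystem (placeholder URL and token)
    url: 'libsql://your-database.turso.io',
    authToken: process.env.TURSO_AUTH_TOKEN,
  }),
})
```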
Add the `@mastra/deployer-vercel` package to your project:

```bash
npm install @mastra/deployer-vercel@latest
```
Import `VercelDeployer` and set it as the deployer in your Mastra configuration:

```typescript
import { Mastra } from '@mastra/core'
// highlight-next-line
import { VercelDeployer } from '@mastra/deployer-vercel'

export const mastra = new Mastra({
  // highlight-next-line
  deployer: new VercelDeployer(),
})
```
By default, Vercel runs `npm run build`, which triggers `mastra build`. If you don't have a build script, add `"build": "mastra build"` to your `package.json`.
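The resulting scripts section of your `package.json` would look like:

```json
{
  "scripts": {
    "build": "mastra build"
  }
}
```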
:::note
Remember to set the environment variables your application needs to run (e.g. your [model provider](/models/providers) API key).
:::
Since the Mastra server prefixes every endpoint with `/api`, include that prefix in your URLs when making requests.
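For example, building the URL for an agent's generate endpoint (a sketch; the project name and agent id `myAgent` are placeholders):

```typescript
// Base URL of a deployed project (placeholder project name)
const base = 'https://my-project.vercel.app'

// Every Mastra server route is mounted under /api
const url = `${base}/api/agents/myAgent/generate`

console.log(url) // → https://my-project.vercel.app/api/agents/myAgent/generate
```

You can then POST messages to that URL with `fetch` or `curl`.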
:::warning
Set up [authentication](/docs/server/auth) before exposing your endpoints publicly.
:::
You can deploy Studio alongside your API by enabling the `studio` option. Studio is deployed as static assets served from Vercel's Edge CDN, so it doesn't consume function invocations.

```typescript
import { Mastra } from '@mastra/core'
import { VercelDeployer } from '@mastra/deployer-vercel'

export const mastra = new Mastra({
  deployer: new VercelDeployer({
    // highlight-next-line
    studio: true,
  }),
})
```
After deploying, Studio is available at the root URL (`https://<your-project>.vercel.app/`) and the API remains at `/api/*`. Studio automatically connects to the API on the same origin, so no additional environment variables are needed.
:::warning
Once Studio is connected to your Mastra server, it has full access to your agents, workflows, and tools. Be sure to secure it properly in production (e.g. behind authentication, a VPN, etc.) to prevent unauthorized access.
:::
The Vercel deployer supports configuration options that are written to the function config in Vercel's Build Output API. See the `VercelDeployer` reference for available options like `maxDuration`, `memory`, and `regions`.
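As a sketch, passing options to the deployer might look like the following (the values are illustrative, not recommendations; consult the `VercelDeployer` reference for the exact option names and accepted values):

```typescript
import { Mastra } from '@mastra/core'
import { VercelDeployer } from '@mastra/deployer-vercel'

export const mastra = new Mastra({
  deployer: new VercelDeployer({
    maxDuration: 60,   // max execution time in seconds (illustrative value)
    memory: 1024,      // function memory in MB (illustrative value)
    regions: ['iad1'], // Vercel region codes (illustrative value)
  }),
})
```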
Serverless functions can terminate immediately after returning a response, so any pending async work (like sending telemetry) may be killed before it completes. Awaiting `flush()` ensures all traces are sent before the function exits.
```typescript
import type { VercelRequest, VercelResponse } from '@vercel/node'
import { mastra } from '../src/mastra'

export default async function handler(req: VercelRequest, res: VercelResponse) {
  const { message } = req.body

  const agent = mastra.getAgent('myAgent')
  const result = await agent.generate([{ role: 'user', content: message }])

  // Flush pending traces before the function exits
  const observability = mastra.getObservability()
  await observability?.flush()

  return res.json(result)
}
```
:::warning
The Vercel deployer doesn't add flush calls for you. If you need them, wrap the handler yourself and flush before returning the response. Alternatively, deploy to a long-running server, such as a virtual machine, where this isn't an issue.
:::