
Server-Side Rendering

packages/docs/docs/ssr.mdx


import {TableOfContents} from './renderer/TableOfContents';

Remotion's rendering engine is built on Node.js, which makes it easy to render videos in the cloud.

See the Comparison of SSR options to help you decide which one fits your use case.

Render a video on AWS Lambda

The fastest way to render videos in the cloud is to use @remotion/lambda.
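To illustrate, a render on Lambda can be triggered with `renderMediaOnLambda()` from `@remotion/lambda/client`. This is a minimal sketch: the function name, serve URL, and composition ID are placeholders you would replace with the values from your own deployment.

```typescript
import {renderMediaOnLambda} from '@remotion/lambda/client';

// All values below are hypothetical placeholders for your own deployment:
// deploy a function and a site first (see the @remotion/lambda setup guide).
const {renderId, bucketName} = await renderMediaOnLambda({
  region: 'us-east-1',
  functionName: 'my-remotion-function', // placeholder
  serveUrl: 'https://example.com/my-site', // placeholder
  composition: 'MyComp',
  inputProps: {},
  codec: 'h264',
});

console.log(`Render started: ${renderId} in bucket ${bucketName}`);
```

The call returns immediately; you can poll the render's status afterwards (e.g. with `getRenderProgress()`).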

Render using Vercel Sandbox

The easiest way, especially for Vercel customers, is to use Vercel Sandbox.

Render a video using Node.js APIs

We provide a set of APIs to render videos using Node.js and Bun.
See an example or the API reference for more information.
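The typical flow is: bundle the project with `@remotion/bundler`, pick a composition with `selectComposition()`, then render it with `renderMedia()`. A minimal sketch, assuming an entry point at `./src/index.ts` and a composition ID of `MyComp`:

```typescript
import {bundle} from '@remotion/bundler';
import {renderMedia, selectComposition} from '@remotion/renderer';

// Bundle the Remotion project into a servable site.
// The entry point path is an assumption; adjust it to your project layout.
const serveUrl = await bundle({entryPoint: './src/index.ts'});

// Look up the composition and its metadata (dimensions, duration, fps).
const composition = await selectComposition({
  serveUrl,
  id: 'MyComp',
  inputProps: {},
});

// Render the composition to an MP4 file.
await renderMedia({
  composition,
  serveUrl,
  codec: 'h264',
  outputLocation: 'out/video.mp4',
  inputProps: {},
});
```

Run this as a script (e.g. `npx tsx render.mts`); top-level `await` requires an ES module context.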

Render using GitHub Actions

You can render a video on GitHub Actions. The following workflow assumes a composition ID of MyComp:

```yaml
name: Render video
on:
  workflow_dispatch:
jobs:
  render:
    name: Render video
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@main
      - uses: actions/setup-node@main
      - run: npm i
      - run: npx remotion render MyComp out/video.mp4
      - uses: actions/upload-artifact@v4
        with:
          name: out.mp4
          path: out/video.mp4
```

With input props

If your composition takes props, you can ask for their values using the GitHub Actions input fields.
Here we assume a shape of {titleText: string; titleColor: string}.

```yaml
name: Render video
on:
  workflow_dispatch:
    inputs:
      titleText:
        description: 'Which text should it say?'
        required: true
        default: 'Welcome to Remotion'
      titleColor:
        description: 'Which color should it be in?'
        required: true
        default: 'black'
jobs:
  render:
    name: Render video
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@main
      - uses: actions/setup-node@main
      - run: npm i
      - run: echo $WORKFLOW_INPUT > input-props.json
        env:
          WORKFLOW_INPUT: ${{ toJson(github.event.inputs) }}
      - run: npx remotion render MyComp out/video.mp4 --props="./input-props.json"
      - uses: actions/upload-artifact@v4
        with:
          name: out.mp4
          path: out/video.mp4
```
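On the Remotion side, the props passed via `--props` are handed to the composition's component. A hypothetical `MyComp` matching the assumed `{titleText: string; titleColor: string}` shape could look like this:

```typescript
import React from 'react';
import {AbsoluteFill} from 'remotion';

// Hypothetical component for illustration; the props shape matches the
// workflow_dispatch inputs defined in the workflow above.
export const MyComp: React.FC<{
  titleText: string;
  titleColor: string;
}> = ({titleText, titleColor}) => {
  return (
    <AbsoluteFill
      style={{
        backgroundColor: 'white',
        justifyContent: 'center',
        alignItems: 'center',
      }}
    >
      <h1 style={{color: titleColor}}>{titleText}</h1>
    </AbsoluteFill>
  );
};
```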

<Step>1</Step> Commit the template to a GitHub repository.
<Step>2</Step> On GitHub, click the <code>Actions</code> tab.
<Step>3</Step> Select the <code>Render video</code> workflow on the left.
<Step>4</Step> A <code>Run workflow</code> button should appear. Click it.
<Step>5</Step> Fill in the props of the root component and click <code>Run workflow</code>.

<Step>6</Step> After the rendering is finished, you can download the video under <code>Artifacts</code>.

Note that running the workflow may incur costs. However, the workflow will only run if you actively trigger it.

See also: Passing input props in GitHub Actions

Render a video using Docker

See: Dockerizing a Remotion project.
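As a rough sketch, a Dockerfile for a Remotion project installs the project's dependencies and runs the CLI render. The base image and system package list below are assumptions; refer to the Dockerizing guide for the maintained version, which also covers the shared libraries headless Chrome needs.

```docker
# Minimal sketch, not the official Dockerfile.
FROM node:22-bookworm-slim

# Headless Chrome needs additional shared libraries on slim images;
# the exact package list is documented in the Dockerizing guide.

COPY package.json package-lock.json ./
RUN npm i

COPY . .

# Assumes a composition ID of MyComp, as in the examples above.
CMD ["npx", "remotion", "render", "MyComp", "out/video.mp4"]
```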

Deploy to a cloud platform

The following guides show how to deploy a Remotion rendering service using the Node.js/Bun APIs on various cloud platforms:

Render a video using GCP Cloud Run (Alpha)

Check out the experimental Cloud Run package.
Note: It is not actively being developed; the plan is to port the Lambda runtime to Cloud Run instead of maintaining a separate implementation.

API reference

<TableOfContents />