20 minutes. That's all it takes to build a plugin that generates AI videos from text prompts. No boilerplate, no complex setup - just code that works.
<Tip>
**Video Tutorial**: [**Plugin Power: Add Superpowers to Your Agents**](https://www.youtube.com/watch?v=nC6veN2Q-ps&list=PLrjBjP4nU8ehOgKAa0-XddHzE0KK0nNvS&index=4)
</Tip>

This guide shows how to build a fal.ai plugin that lets your agent generate 6-second, 768p videos from text prompts using the MiniMax Hailuo-02 model. For architectural concepts, see Plugin Architecture.
You'll learn how to scaffold a project and plugin with the CLI, explore a model's API on fal.ai, turn the example action into a video-generation action, register the plugin with your agent, and test everything end to end. For component details and patterns, see Plugin Components and Plugin Patterns.
Create a project with a plugin inside using CLI commands:
<Steps>
<Step title="Create project">
```bash Terminal
elizaos create --type project my-eliza-project
```
Configure when prompted:
- **Database**: PgLite (perfect for local development)
- **Model**: OpenAI
```bash Terminal
cd my-eliza-project
```
```typescript src/character.ts
import { type Character } from '@elizaos/core';

export const character: Character = {
name: 'Eliza',
plugins: [
'@elizaos/plugin-sql',
'@elizaos/plugin-openai',
'./plugin-fal-ai' // [!code ++]
],
};
```
<Note>
The basic-capabilities plugin is automatically included - you don't need to add it manually.
</Note>
```bash Terminal
# Build the plugin first
cd plugin-fal-ai
bun run build
# Go back to project and start
cd ..
elizaos start
```
**Verify it's loaded:**
- Check the console logs for `Successfully loaded plugin 'plugin-fal-ai'`
- Visit `http://localhost:3000` → click your agent → **Plugins tab**
Let's research what we want to build by exploring fal.ai for a good text-to-video model. MiniMax Hailuo-02 Text to Video looks pretty good.
From the model's page, we learn everything we need to integrate it:

- **Install**: `bun add @fal-ai/client`
- **Import**: `import { fal } from "@fal-ai/client"`
- **Call pattern**: `fal.subscribe("model-endpoint", { input: {...} })`
- **Response shape**: `{ data, requestId }`

Run the install command inside your plugin directory; it adds the [fal.ai](https://fal.ai) client package to your plugin dependencies. Now we know exactly what to build and how to call it, so let's start developing our plugin.
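Since the response shape is `{ data, requestId }`, it's worth deciding up front how to pull the video URL out of it defensively. Here's a small sketch using plain objects in place of the real client response; the `FalVideoResult` interface and `getVideoUrl` helper are our own illustration, not part of `@fal-ai/client`:

```typescript
// Hypothetical shape of a successful text-to-video response, based on the
// { data, requestId } pattern above; the real client types may be richer.
interface FalVideoResult {
  data?: { video?: { url?: string } };
  requestId?: string;
}

// Return the video URL, or null instead of throwing on a malformed payload.
function getVideoUrl(result: FalVideoResult): string | null {
  return result.data?.video?.url ?? null;
}

console.log(getVideoUrl({ data: { video: { url: 'https://example.com/v.mp4' } } })); // https://example.com/v.mp4
console.log(getVideoUrl({ requestId: 'abc-123' })); // null
```

Optional chaining keeps the handler from crashing if the API ever returns a payload without a video.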
```bash Terminal
mkdir src/actions
cp src/plugin.ts src/actions/generateVideo.ts
```
Now let's turn the copied example action into our generateVideo action:
**Add the fal.ai import (from the fal.ai docs):**
```typescript src/actions/generateVideo.ts
import {
Action, ActionResult, IAgentRuntime, Memory, HandlerCallback, HandlerOptions, State, logger
} from '@elizaos/core';
import { fal } from '@fal-ai/client'; // [!code ++]
```
**Update the action identity for video generation:**
```typescript
const quickAction: Action = { // [!code --]
export const generateVideoAction: Action = { // [!code ++]
name: 'QUICK_ACTION', // [!code --]
name: 'TEXT_TO_VIDEO', // [!code ++]
similes: ['GREET', 'SAY_HELLO', 'HELLO_WORLD'], // [!code --]
similes: ['CREATE_VIDEO', 'MAKE_VIDEO', 'GENERATE_VIDEO', 'VIDEO_FROM_TEXT'], // [!code ++]
description: 'Responds with a simple hello world message', // [!code --]
description: 'Generate a video from text using MiniMax Hailuo-02', // [!code ++]
```
**Replace validation with API key check:**
```typescript
validate: async (_runtime, _message, _state) => { // [!code --]
return true; // Always valid // [!code --]
}, // [!code --]
validate: async (runtime: IAgentRuntime, message: Memory) => { // [!code ++]
const falKey = runtime.getSetting('FAL_KEY'); // [!code ++]
if (!falKey) { // [!code ++]
logger.error('FAL_KEY not found in environment variables'); // [!code ++]
return false; // [!code ++]
} // [!code ++]
return true; // [!code ++]
}, // [!code ++]
```
**Replace hello world logic with video generation:**
```typescript
handler: async (_runtime, message, _state, _options, callback) => { // [!code --]
const response = 'Hello world!'; // [!code --]
if (callback) { // [!code --]
await callback({ // [!code --]
text: response, // [!code --]
actions: ['QUICK_ACTION'], // [!code --]
source: message.content.source, // [!code --]
}); // [!code --]
} // [!code --]
return { // [!code --]
text: response, // [!code --]
success: true, // [!code --]
data: { actions: ['QUICK_ACTION'], source: message.content.source } // [!code --]
}; // [!code --]
}, // [!code --]
handler: async ( // [!code ++]
runtime: IAgentRuntime, // [!code ++]
message: Memory, // [!code ++]
_state: State | undefined, // [!code ++]
_options: HandlerOptions | undefined, // [!code ++]
callback?: HandlerCallback // [!code ++]
): Promise<ActionResult> => { // [!code ++]
try { // [!code ++]
fal.config({ credentials: runtime.getSetting('FAL_KEY') }); // [!code ++]
let prompt = (message.content.text ?? '').replace(/^(create video:|make video:|generate video:)/i, '').trim(); // [!code ++]
if (!prompt) return { success: false, text: 'I need a description' }; // [!code ++]
const result = await fal.subscribe("fal-ai/minimax/hailuo-02/standard/text-to-video", { // [!code ++]
input: { prompt, duration: "6" }, logs: true // [!code ++]
}); // [!code ++]
const videoUrl = result.data.video.url; // [!code ++]
if (callback) await callback({ text: `✅ Video ready! ${videoUrl}` }); // [!code ++]
return { success: true, text: 'Video generated', data: { videoUrl, prompt } }; // [!code ++]
} catch (error) { // [!code ++]
const detail = error instanceof Error ? error.message : String(error); // [!code ++]
return { success: false, text: `Failed: ${detail}` }; // [!code ++]
} // [!code ++]
}, // [!code ++]
```
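The prompt-extraction line in the handler is easy to get subtly wrong, so it can help to pull it into a small, unit-testable helper. A sketch; the `extractVideoPrompt` name is ours, not part of the plugin template:

```typescript
// Hypothetical helper: strip command prefixes like "Create video:" and return
// the remaining prompt, or null when nothing usable is left.
function extractVideoPrompt(text: string | undefined): string | null {
  if (!text) return null;
  const prompt = text.replace(/^(create video:|make video:|generate video:)/i, '').trim();
  return prompt.length > 0 ? prompt : null;
}

console.log(extractVideoPrompt('Create video: dolphins jumping')); // dolphins jumping
console.log(extractVideoPrompt('make video:')); // null
```

Returning `null` for empty prompts lets the handler reply "I need a description" in one early-exit check.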
**Update examples for video conversations:**
```typescript
examples: [ // [!code --]
[{ // [!code --]
name: '{{name1}}', // [!code --]
content: { text: 'Can you say hello?' } // [!code --]
}, { // [!code --]
name: '{{name2}}', // [!code --]
content: { text: 'hello world!', actions: ['QUICK_ACTION'] } // [!code --]
}] // [!code --]
], // [!code --]
examples: [ // [!code ++]
[{ name: '{{user}}', content: { text: 'Create video: dolphins jumping' } }, // [!code ++]
{ name: '{{agent}}', content: { text: 'Creating video!', actions: ['TEXT_TO_VIDEO'] }}] // [!code ++]
], // [!code ++]
};
```
```typescript src/index.ts
import { Plugin } from '@elizaos/core';
import { generateVideoAction } from './actions/generateVideo'; // [!code ++]
export const falaiPlugin: Plugin = { // [!code ++]
name: 'fal-ai', // [!code ++]
description: 'Generate videos using fal.ai MiniMax Hailuo-02', // [!code ++]
actions: [generateVideoAction], // [!code ++]
providers: [], // [!code ++]
services: [] // [!code ++]
}; // [!code ++]
export default falaiPlugin; // [!code ++]
export { generateVideoAction }; // [!code ++]
```
<Note>
You can reference `plugin.ts` as well as other plugins from the [Plugin Registry](/plugin-registry/overview) to see other plugin component examples (providers, services, etc.) as you expand your plugin.
</Note>
```bash .env
PGLITE_DATA_DIR=./.eliza/.elizadb
OPENAI_API_KEY=your_openai_key_here
FAL_KEY=your_fal_key_here # [!code ++]
```
Verify your plugin works as expected:
<Steps>
<Step title="Test the updated plugin">
First rebuild your plugin to pick up the changes, then start from the project root:
```bash Terminal
# Build the plugin first
cd plugin-fal-ai
bun run build

# Start from project root
cd ..
elizaos start
```
Then try prompts like:
- `"Create video: dolphins jumping in ocean"`
- `"Make video: cat playing piano"`
- `"Generate video: sunset over mountains"`
You should see the video generation process and get a URL to view the result!
Plugins come with component and E2E tests by default. Let's add a custom component test:
<Steps>
<Step title="Add a component test">
Update `plugin-fal-ai/src/__tests__/plugin.test.ts`:
```typescript src/__tests__/plugin.test.ts
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import type { IAgentRuntime, Memory } from '@elizaos/core'; // [!code ++]
import { falaiPlugin, generateVideoAction } from '../index'; // [!code ++]
import { createTestRuntime, cleanupRuntime } from './test-utils'; // [!code ++]
describe('FAL AI Plugin', () => {
let runtime: IAgentRuntime; // [!code ++]
beforeEach(async () => { // [!code ++]
runtime = await createTestRuntime(); // [!code ++]
}); // [!code ++]
afterEach(async () => { // [!code ++]
await cleanupRuntime(runtime); // [!code ++]
}); // [!code ++]
it('action validates with FAL_KEY', async () => { // [!code ++]
// Set FAL_KEY in test runtime
runtime.setSetting('FAL_KEY', 'test-key'); // [!code ++]
const message = { content: { text: 'test' } } as Memory; // [!code ++]
const isValid = await generateVideoAction.validate( // [!code ++]
runtime, // [!code ++]
message // [!code ++]
); // [!code ++]
expect(isValid).toBe(true); // [!code ++]
}); // [!code ++]
});
```
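For quick checks that don't need a full test runtime, a minimal stub of the runtime's settings lookup is enough to exercise the same logic the action's `validate` performs. A sketch; the stub types and helpers below are ours, not part of `@elizaos/core`:

```typescript
// Minimal stand-in for the runtime's settings lookup, enough to exercise the
// validate logic without booting a real agent runtime.
type SettingsStub = { getSetting: (key: string) => string | undefined };

const makeRuntimeStub = (settings: Record<string, string>): SettingsStub => ({
  getSetting: (key) => settings[key],
});

// Mirrors the action's validate check: FAL_KEY must be present.
const hasFalKey = (runtime: SettingsStub): boolean =>
  Boolean(runtime.getSetting('FAL_KEY'));

console.log(hasFalKey(makeRuntimeStub({ FAL_KEY: 'test-key' }))); // true
console.log(hasFalKey(makeRuntimeStub({}))); // false
```

The trade-off: stubs run fast and need no database, while the `createTestRuntime` approach above catches integration issues the stub cannot.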
Let's also add a custom E2E test:
<Steps>
<Step title="Add an E2E test">
Update `src/__tests__/e2e/plugin-fal-ai.e2e.ts`:
```typescript src/__tests__/e2e/plugin-fal-ai.e2e.ts
export const FalAiTestSuite = { // [!code ++]
name: 'fal-ai-video-generation', // [!code ++]
tests: [{ // [!code ++]
name: 'should find video action in runtime', // [!code ++]
fn: async (runtime) => { // [!code ++]
const action = runtime.actions.find(a => a.name === 'TEXT_TO_VIDEO'); // [!code ++]
if (!action) throw new Error('TEXT_TO_VIDEO action not found'); // [!code ++]
} // [!code ++]
}] // [!code ++]
}; // [!code ++]
```
Congratulations! You now have a working video generation plugin. From here you can keep improving it: support more fal.ai models, expose options like duration and resolution, add providers or services, or share it through the Plugin Registry. The possibilities are endless!