
AWS Examples

packages/docs/examples-gallery/aws.mdx


Deploy AI agents as serverless AWS Lambda functions.

Quick Start

bash
cd examples/aws
export OPENAI_API_KEY="your-key"

# TypeScript
cd typescript && bun run build
sam deploy --guided --parameter-overrides OpenAIApiKey=$OPENAI_API_KEY

Architecture

┌──────────────┐     ┌─────────────────┐     ┌────────────────┐
│  Client      │────▶│  API Gateway    │────▶│  Lambda        │
│              │◀────│  (HTTP API)     │◀────│  (elizaOS)     │
└──────────────┘     └─────────────────┘     └────────────────┘

Available Implementations

Language     Directory                   Runtime
TypeScript   examples/aws/typescript/    Node.js 20
Python       examples/aws/python/        Python 3.11
Rust         examples/aws/rust/          Custom Runtime

TypeScript Handler

typescript
import { APIGatewayProxyHandler } from "aws-lambda";
import { AgentRuntime } from "@elizaos/core";
import { openaiPlugin } from "@elizaos/plugin-openai";

// Cached at module scope so warm invocations reuse the initialized runtime.
let runtime: AgentRuntime | null = null;

export const handler: APIGatewayProxyHandler = async (event) => {
  if (!runtime) {
    runtime = new AgentRuntime({
      character: { name: "Eliza", bio: "A helpful AI." },
      plugins: [openaiPlugin],
    });
    await runtime.initialize();
  }

  const { message } = JSON.parse(event.body || "{}");
  const response = await runtime.useModel("TEXT_LARGE", { prompt: message });

  return {
    statusCode: 200,
    body: JSON.stringify({ response: String(response) }),
  };
};
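The module-scope `runtime` variable implements the standard warm-container caching pattern: state declared outside the handler survives between invocations of the same Lambda container, so the expensive initialization runs once per container rather than once per request. An isolated sketch of the pattern, with a stand-in object replacing `AgentRuntime`:

```typescript
// Module-scope cache: persists across invocations of a warm container.
let cachedRuntime: { ready: boolean } | null = null; // stand-in for AgentRuntime
let initCount = 0;

async function getRuntime(): Promise<{ ready: boolean }> {
  if (!cachedRuntime) {
    initCount += 1; // expensive setup (plugin loading, etc.) runs only here
    cachedRuntime = { ready: true };
  }
  return cachedRuntime;
}
```

Cold starts pay the initialization cost once; every subsequent request on that container returns the cached instance immediately.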

Testing Locally

bash
cd examples/aws/typescript
bun run start  # Starts local server on port 3000
curl -X POST http://localhost:3000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello!"}'
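The local server mirrors API Gateway, but the handler can also be unit-tested by calling it directly with a synthetic event. A minimal sketch, where the echo handler is a hypothetical stand-in for the real one (which calls the model):

```typescript
// Minimal shape of the fields the handler reads from an API Gateway event.
type ProxyEvent = { body: string | null };
type ProxyResult = { statusCode: number; body: string };

// Stand-in handler mirroring the parse/respond structure shown above.
async function handler(event: ProxyEvent): Promise<ProxyResult> {
  const { message } = JSON.parse(event.body ?? "{}");
  return {
    statusCode: 200,
    body: JSON.stringify({ response: `You said: ${message}` }),
  };
}

// Build the same POST body the curl command above sends.
const event: ProxyEvent = { body: JSON.stringify({ message: "Hello!" }) };
```

This keeps handler tests fast and offline; the `sam local` / `bun run start` path is still useful for verifying routing and headers end to end.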

Configuration

Variable         Default        Description
OPENAI_API_KEY   (required)     OpenAI API key
CHARACTER_NAME   Eliza          Agent name
CHARACTER_BIO    A helpful AI.  Agent bio
LOG_LEVEL        INFO           Logging level
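The optional variables can be read with fallbacks matching the defaults above. A sketch (the `envOr` helper is illustrative, not part of the examples):

```typescript
// Read an environment variable, falling back to a default when unset or empty.
function envOr(name: string, fallback: string): string {
  const value = process.env[name];
  return value !== undefined && value !== "" ? value : fallback;
}

// Defaults taken from the configuration table above.
const config = {
  characterName: envOr("CHARACTER_NAME", "Eliza"),
  characterBio: envOr("CHARACTER_BIO", "A helpful AI."),
  logLevel: envOr("LOG_LEVEL", "INFO"),
};
```

In the SAM template, these map to the function's `Environment.Variables` section, so they can be overridden per deployment without code changes.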

Cost Estimate

Assuming 512 MB memory, a 2 s average duration, and 10K requests/month:

  • Requests: $0.002
  • Duration: $0.17
  • Total: ~$0.20/month
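The line items can be reproduced from published Lambda prices (assumed us-east-1 rates, subject to change; free tier and API Gateway charges ignored): $0.20 per 1M requests and $0.0000166667 per GB-second.

```typescript
// Assumed us-east-1 Lambda prices; check current AWS pricing before relying on these.
const PRICE_PER_REQUEST = 0.2 / 1_000_000;   // $ per request
const PRICE_PER_GB_SECOND = 0.0000166667;    // $ per GB-second

function monthlyCost(requests: number, avgSeconds: number, memoryMB: number) {
  const requestCost = requests * PRICE_PER_REQUEST;
  const gbSeconds = requests * avgSeconds * (memoryMB / 1024);
  const durationCost = gbSeconds * PRICE_PER_GB_SECOND;
  return { requestCost, durationCost, total: requestCost + durationCost };
}

const estimate = monthlyCost(10_000, 2, 512);
// requestCost ≈ $0.002, durationCost ≈ $0.17
```

Note that model API usage (the OpenAI calls) is billed separately and will typically dominate the Lambda cost.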