Next.js Batch LLM Evaluator

docs/guides/example-projects/batch-llm-evaluator.mdx


import RealtimeLearnMore from "/snippets/realtime-learn-more.mdx";

Overview

This demo is a full-stack example that uses the following:

  • A Next.js app with Prisma for the database.
  • Trigger.dev Realtime to stream updates to the frontend.
  • The Vercel AI SDK to work with multiple LLM models (OpenAI, Anthropic, xAI).
  • The batch.triggerByTaskAndWait method to distribute work across multiple tasks.
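The fan-out step above can be sketched roughly as follows. This is a minimal illustration, not the repo's exact code: the payload shape, model names, and the `evaluateModelTask` child task are assumptions for the example; `batch.triggerByTaskAndWait` is the real Trigger.dev v3 SDK method.

```typescript
// Hypothetical sketch: fan one prompt out to several model-evaluation tasks.
// batch.triggerByTaskAndWait takes an array of { task, payload } items and
// waits for every child run to finish before returning.

type EvalPayload = { model: string; prompt: string };

// Pure helper: build one payload per model to evaluate.
// (Kept SDK-free so it can be exercised on its own.)
function buildEvalPayloads(prompt: string, models: string[]): EvalPayload[] {
  return models.map((model) => ({ model, prompt }));
}

// Inside a real parent task you would then do something like:
//
// import { batch, task } from "@trigger.dev/sdk/v3";
//
// const { runs } = await batch.triggerByTaskAndWait(
//   buildEvalPayloads(prompt, ["openai", "anthropic", "xai"]).map(
//     (payload) => ({ task: evaluateModelTask, payload }) // child task defined elsewhere
//   )
// );
// const outputs = runs.filter((r) => r.ok).map((r) => r.output);
```

Each child run evaluates the prompt against one provider via the Vercel AI SDK, and the parent task collects the results once all runs complete; see the repo for the actual task definitions.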

GitHub repo

<Card title="View the Batch LLM Evaluator repo" icon="GitHub" href="https://github.com/triggerdotdev/examples/tree/main/batch-llm-evaluator">

Click here to view the full code for this project in our examples repository on GitHub. You can fork it and use it as a starting point for your own project. </Card>

Video

<video controls className="w-full aspect-video" src="https://content.trigger.dev/batch-llm-evaluator.mp4"></video>

Relevant code

<Note> This example uses the older `useRealtimeRunWithStreams` hook. For new projects, consider using the new [`useRealtimeStream`](/realtime/react-hooks/streams#userealtimestream-recommended) hook (SDK 4.1.0+) for a simpler API and better type safety with defined streams. </Note> <RealtimeLearnMore />