# Scorecard
Scorecard is an observability platform for monitoring and evaluating LLM applications. After integrating with the AI SDK, you can use Scorecard to trace, monitor, and analyze your LLM providers, prompts, and application flows.
Scorecard supports AI SDK telemetry data. To get started, sign up at [https://app.scorecard.io](https://app.scorecard.io) and copy your API key from your settings page.
To use the AI SDK to send telemetry data to Scorecard, first set these environment variables in your project:
```bash
OTEL_EXPORTER_OTLP_ENDPOINT=https://tracing.scorecard.io/otel/v1/traces
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <Your Scorecard API Key>"
```
Next, create an `instrumentation.ts` file in your project root to initialize OpenTelemetry (adjust the configuration as needed):
```ts
import { registerOTel } from '@vercel/otel';

export function register() {
  registerOTel({
    serviceName: 'my-service-name',
  });
}
```
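If your project does not run on Next.js (where `@vercel/otel` fits naturally), a plain Node.js process can be instrumented with the standard `@opentelemetry/sdk-node` package instead. This is a minimal sketch using stock OpenTelemetry APIs, not Scorecard-specific guidance:

```ts
// instrumentation.ts — minimal OpenTelemetry setup for a plain Node.js app
import { NodeSDK } from '@opentelemetry/sdk-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

const sdk = new NodeSDK({
  serviceName: 'my-service-name',
  // With no constructor arguments, the exporter is configured from the
  // OTEL_EXPORTER_OTLP_* environment variables set above.
  traceExporter: new OTLPTraceExporter(),
});

// Start tracing before any AI SDK calls are made.
sdk.start();
```

Run this once at process startup (for example, via `node --import ./instrumentation.ts`-style preloading) so spans are exported for the lifetime of the app.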
You can then use the experimental_telemetry option to enable telemetry on supported AI SDK function calls:
```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Tell me a joke',
  experimental_telemetry: { isEnabled: true },
});
```
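You can also attach an identifier and custom metadata to each call through the AI SDK's telemetry options; these are recorded on the resulting spans, which makes traces easier to filter in your observability backend. The `functionId` and `metadata` values below are illustrative:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Tell me a joke',
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'tell-joke',          // illustrative function identifier
    metadata: { userId: 'user-123' }, // illustrative custom metadata
  },
});
```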
After integrating, you can view your traces, prompts, and application flows in your Scorecard dashboard.