# AI SDK - Open Responses Provider
The Open Responses provider for the AI SDK contains language model support for Open Responses-compatible APIs.
## Setup

The Open Responses provider is available in the `@ai-sdk/open-responses` module. You can install it with:

```bash
npm i @ai-sdk/open-responses
```
If you use coding agents such as Claude Code or Cursor, we highly recommend adding the AI SDK skill to your repository:

```bash
npx skills add vercel/ai
```
## Provider Instance

Create an Open Responses provider instance using `createOpenResponses`:

```ts
import { createOpenResponses } from '@ai-sdk/open-responses';

const openResponses = createOpenResponses({
  name: 'aProvider',
  url: 'http://localhost:1234/v1/responses',
});
```
## Language Models

You can use this instance to access models served by any Open Responses-compatible endpoint:

```ts
import { createOpenResponses } from '@ai-sdk/open-responses';
import { generateText } from 'ai';

const openResponses = createOpenResponses({
  name: 'aProvider',
  url: 'http://localhost:1234/v1/responses',
});

const { text } = await generateText({
  model: openResponses('mistralai/ministral-3-14b-reasoning'),
  prompt: 'Invent a new holiday and describe its traditions.',
  maxOutputTokens: 100,
});
```
## Documentation

Please check out the Open Responses provider documentation for more information.