Portkey natively integrates with the AI SDK to make your apps production-ready and reliable. Import Portkey's Vercel package and use it as a provider in your Vercel AI app to enable all of Portkey's features:
Learn more in the Portkey documentation for the AI SDK.
The Portkey provider is available in the @portkey-ai/vercel-provider module. You can install it with:
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @portkey-ai/vercel-provider" dark />
  </Tab>
</Tabs>

To create a Portkey provider instance, use the `createPortkey` function:
```ts
import { createPortkey } from '@portkey-ai/vercel-provider';

const portkeyConfig = {
  provider: 'openai', // enter the provider of your choice
  api_key: 'OPENAI_API_KEY', // enter the respective provider's API key
  override_params: {
    model: 'gpt-5', // choose from 250+ LLMs
  },
};

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});
```
You can find your Portkey API key in the Portkey Dashboard.
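Hardcoding the key works for quick experiments, but a common pattern is to load it from the environment. A minimal sketch — the helper and the `PORTKEY_API_KEY` variable name are illustrative assumptions, not part of the Portkey package:

```ts
// Hypothetical helper (not part of @portkey-ai/vercel-provider): read the
// Portkey API key from an environment map instead of hardcoding it.
// In a Node app you would call it as getPortkeyApiKey(process.env).
function getPortkeyApiKey(env: Record<string, string | undefined>): string {
  const key = env.PORTKEY_API_KEY;
  if (!key) {
    throw new Error('Set the PORTKEY_API_KEY environment variable');
  }
  return key;
}

console.log(getPortkeyApiKey({ PORTKEY_API_KEY: 'pk-example' }));
```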
Portkey supports both chat and completion models. Use `portkey.chatModel()` for chat models and `portkey.completionModel()` for completion models:

```ts
const chatModel = portkey.chatModel('');
const completionModel = portkey.completionModel('');
```
Note: You can provide an empty string as the model name if you've defined it in the `portkeyConfig`.
You can use Portkey language models with the `generateText` or `streamText` function:
### generateText

```ts
import { createPortkey } from '@portkey-ai/vercel-provider';
import { generateText } from 'ai';

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const { text } = await generateText({
  model: portkey.chatModel(''),
  prompt: 'What is Portkey?',
});

console.log(text);
```
### streamText

```ts
import { createPortkey } from '@portkey-ai/vercel-provider';
import { streamText } from 'ai';

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const result = streamText({
  model: portkey.completionModel(''),
  prompt: 'Invent a new holiday and describe its traditions.',
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}
```
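The stream of text chunks is an async iterable. A self-contained sketch of consuming such a stream, using a local async generator in place of a live model call (the generator and helper below are illustrative, not part of the AI SDK):

```ts
// Stand-in for a model's text stream: an async iterable of text chunks.
async function* fakeTextStream(): AsyncGenerator<string> {
  for (const chunk of ['Glimmerfest ', 'is celebrated ', 'with lanterns.']) {
    yield chunk;
  }
}

// Accumulate streamed chunks into the full response text.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}

collectStream(fakeTextStream()).then((text) => console.log(text));
// logs: Glimmerfest is celebrated with lanterns.
```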
Note: Tool use and structured output with the AI SDK are currently not supported.

Portkey offers several advanced features to enhance your AI applications:
- **Interoperability**: Easily switch between 250+ AI models by changing the provider and model name in your configuration.
- **Observability**: Access comprehensive analytics and logs for all your requests.
- **Reliability**: Implement caching, fallbacks, load balancing, and conditional routing.
- **Guardrails**: Enforce LLM behavior in real time with input and output checks.
- **Security and Compliance**: Set budget limits and implement fine-grained user roles and permissions.
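As one concrete example of the reliability features, Portkey configs can describe a fallback chain across providers. A hedged sketch that follows the general shape of Portkey's gateway config (the keys and model names are placeholders; verify the exact schema against the Portkey documentation):

```ts
// Sketch of a Portkey gateway config with a fallback strategy: if a request to
// the first target fails, it is retried against the next one. Placeholder
// api_key values and model names; check the Portkey docs for the current schema.
const fallbackConfig = {
  strategy: { mode: 'fallback' },
  targets: [
    {
      provider: 'openai',
      api_key: 'OPENAI_API_KEY',
      override_params: { model: 'gpt-5' },
    },
    {
      provider: 'anthropic',
      api_key: 'ANTHROPIC_API_KEY',
      override_params: { model: 'claude-sonnet' },
    },
  ],
};

// An object like this would be passed as `config` to createPortkey(...).
console.log(fallbackConfig.strategy.mode, fallbackConfig.targets.length);
```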
For detailed information on these features and advanced configuration options, please refer to the Portkey documentation.