Portkey Provider

Portkey natively integrates with the AI SDK to make your apps production-ready and reliable. Import Portkey's Vercel provider package and use it as a provider in your AI SDK app to enable all of Portkey's features:

  • Full-stack observability and tracing for all requests
  • Interoperability across 250+ LLMs
  • 50+ built-in state-of-the-art guardrails
  • Simple & semantic caching to save costs & time
  • Conditional request routing with fallbacks, load-balancing, automatic retries, and more
  • Continuous improvement based on user feedback
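As a sketch of how the routing features above are expressed, a Portkey gateway config with a fallback strategy and automatic retries might look like the following. Field names follow Portkey's gateway config schema as documented; the providers, models, and retry count here are placeholder assumptions, not a definitive setup.

```typescript
// Hypothetical sketch: a Portkey gateway config that retries a failing
// request and then falls back from one provider to another.
// Providers, models, and the retry count are placeholders.
const fallbackConfig = {
  strategy: { mode: 'fallback' as const },
  targets: [
    {
      provider: 'openai',
      api_key: 'OPENAI_API_KEY',
      override_params: { model: 'gpt-4o' },
    },
    {
      provider: 'anthropic',
      api_key: 'ANTHROPIC_API_KEY',
      override_params: { model: 'claude-3-5-sonnet-20240620' },
    },
  ],
  retry: { attempts: 3 }, // retry each target before moving on
};
```

A config object like this is passed as the `config` option when creating the provider instance, the same way as the single-provider config shown below.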

Learn more in the Portkey docs for the AI SDK.

Setup

The Portkey provider is available in the @portkey-ai/vercel-provider module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @portkey-ai/vercel-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @portkey-ai/vercel-provider" dark />
  </Tab>
</Tabs>

Provider Instance

To create a Portkey provider instance, use the createPortkey function:

```typescript
import { createPortkey } from '@portkey-ai/vercel-provider';

const portkeyConfig = {
  provider: 'openai', // enter provider of choice
  api_key: 'OPENAI_API_KEY', // enter the respective provider's API key
  override_params: {
    model: 'gpt-5', // choose from 250+ LLMs
  },
};

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});
```

You can find your Portkey API key in the Portkey Dashboard.
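In practice you would load the Portkey API key from the environment rather than hardcoding it. A minimal sketch of that pattern follows; the `PORTKEY_API_KEY` variable name and the helper function are illustrative assumptions, not part of the provider's API.

```typescript
// Minimal sketch: resolve the Portkey API key from an environment
// record instead of hardcoding it. PORTKEY_API_KEY is an assumed
// variable name; fail fast if it is missing.
function getPortkeyApiKey(env: Record<string, string | undefined>): string {
  const key = env.PORTKEY_API_KEY;
  if (!key) {
    throw new Error('PORTKEY_API_KEY is not set');
  }
  return key;
}
```

You would then pass `apiKey: getPortkeyApiKey(process.env)` to `createPortkey` so the secret never appears in source control.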

Language Models

Portkey supports both chat and completion models. Use portkey.chatModel() for chat models and portkey.completionModel() for completion models:

```typescript
const chatModel = portkey.chatModel('');
const completionModel = portkey.completionModel('');
```

Note: You can provide an empty string as the model name if you've defined it in the portkeyConfig.

Examples

You can use Portkey language models with the generateText or streamText functions:

generateText

```javascript
import { createPortkey } from '@portkey-ai/vercel-provider';
import { generateText } from 'ai';

// portkeyConfig as defined in the Provider Instance section
const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const { text } = await generateText({
  model: portkey.chatModel(''),
  prompt: 'What is Portkey?',
});

console.log(text);
```

streamText

```javascript
import { createPortkey } from '@portkey-ai/vercel-provider';
import { streamText } from 'ai';

// portkeyConfig as defined in the Provider Instance section
const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const result = streamText({
  model: portkey.completionModel(''),
  prompt: 'Invent a new holiday and describe its traditions.',
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
```

Note:

  • Portkey supports tool use with the AI SDK.
  • Structured output via the AI SDK's Output feature is currently not supported.

Advanced Features

Portkey offers several advanced features to enhance your AI applications:

  1. Interoperability: Easily switch between 250+ AI models by changing the provider and model name in your configuration.

  2. Observability: Access comprehensive analytics and logs for all your requests.

  3. Reliability: Implement caching, fallbacks, load balancing, and conditional routing.

  4. Guardrails: Enforce LLM behavior in real-time with input and output checks.

  5. Security and Compliance: Set budget limits and implement fine-grained user roles and permissions.
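As one concrete example of the reliability features above, caching is enabled in the same gateway config object passed to `createPortkey`. The sketch below is hedged: the `cache` field with `mode` and `max_age` follows Portkey's config schema as documented, but treat the exact values as placeholder assumptions.

```typescript
// Hypothetical sketch: a single-provider Portkey config with semantic
// caching enabled. The cache mode and TTL values are placeholders.
const cachedConfig = {
  provider: 'openai',
  api_key: 'OPENAI_API_KEY',
  override_params: { model: 'gpt-4o' },
  cache: {
    mode: 'semantic', // 'simple' for exact-match caching
    max_age: 3600, // cache TTL in seconds
  },
};
```

Semantic caching serves a stored response for prompts that are similar, not just identical, which is where most of the cost savings come from for user-facing apps.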

For detailed information on these features and advanced configuration options, please refer to the Portkey documentation.

Additional Resources