# Using Vercel AI Gateway in LobeHub

Vercel AI Gateway is a unified API that provides access to over 100 AI models through a single endpoint. It offers features such as budget management, usage monitoring, load balancing, and fallback handling.

This guide will walk you through how to use Vercel AI Gateway within LobeHub.

<Steps>

### Step 1: Create an API Key in Vercel AI Gateway
  • Go to the Vercel Dashboard
  • Click on the AI Gateway tab on the left sidebar
  • Select API Keys from the sidebar
  • Click Create Key, then confirm by clicking Create Key in the dialog

### Step 2: Configure Vercel AI Gateway in LobeHub

  • Navigate to the Settings page in LobeHub
  • Under AI Service Provider, find the Vercel AI Gateway section
  • Enter the API key you obtained
  • Choose a model from Vercel AI Gateway and start chatting with the AI assistant

<Callout type={'warning'}>
  You may incur charges from the API service provider during usage. Please refer to the Vercel AI Gateway pricing policy for details.
</Callout>

</Steps>

That's it! You can now start chatting in LobeHub using models provided by Vercel AI Gateway.

## Model Selection

Vercel AI Gateway supports a variety of model providers, including:

  • OpenAI: openai/gpt-4o, openai/gpt-4o-mini, openai/o1, and more
  • Anthropic: anthropic/claude-3-5-sonnet, anthropic/claude-3-opus, and more
  • Google: google/gemini-2.5-pro, google/gemini-2.0-flash, and more
  • DeepSeek: deepseek/deepseek-chat, deepseek/deepseek-reasoner, and more
  • And many others...

To view the full list of supported models, visit the Vercel AI Gateway Models page.
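You can also fetch the model list programmatically. The sketch below uses only Python's standard library and assumes the gateway exposes an OpenAI-compatible `/models` endpoint; the `AI_GATEWAY_API_KEY` environment variable name is illustrative:

```python
import json
import os
import urllib.request

GATEWAY_URL = "https://ai-gateway.vercel.sh/v1"

def list_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET request for the gateway's model list."""
    return urllib.request.Request(
        f"{GATEWAY_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# Only send when a key is configured (requires network access):
api_key = os.environ.get("AI_GATEWAY_API_KEY")
if api_key:
    with urllib.request.urlopen(list_models_request(api_key)) as resp:
        print([m["id"] for m in json.loads(resp.read())["data"]])
```

Each returned model ID uses the `provider/model` format shown in the list above.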

## API Configuration

Vercel AI Gateway uses an OpenAI-compatible API format. The base URL is:

https://ai-gateway.vercel.sh/v1

You can use any OpenAI-compatible client with this endpoint and your API key.
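As a minimal sketch of that, here is a chat-completion call in the OpenAI wire format using only Python's standard library. The `AI_GATEWAY_API_KEY` environment variable and the `openai/gpt-4o-mini` default model are illustrative choices, not names mandated by the gateway:

```python
import json
import os
import urllib.request

GATEWAY_URL = "https://ai-gateway.vercel.sh/v1"  # OpenAI-compatible base URL

def build_chat_request(prompt: str, api_key: str,
                       model: str = "openai/gpt-4o-mini") -> urllib.request.Request:
    """Build a chat-completion request in the OpenAI wire format."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{GATEWAY_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Only send when a key is configured (requires network access):
api_key = os.environ.get("AI_GATEWAY_API_KEY")
if api_key:
    with urllib.request.urlopen(build_chat_request("Hello!", api_key)) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
```

Any OpenAI SDK can be pointed at the same endpoint by setting its base URL to `GATEWAY_URL` and passing your gateway API key.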