docs/usage/providers/vercel-ai-gateway.mdx
Vercel AI Gateway is a unified API that provides access to over 100 AI models through a single endpoint. It offers features such as budget management, usage monitoring, load balancing, and fallback handling.
This guide will walk you through how to use Vercel AI Gateway within LobeHub.
<Steps>
  ### Step 1: Create an API Key in Vercel AI Gateway

  Create an API key in your Vercel AI Gateway dashboard.

  ### Step 2: Configure Vercel AI Gateway in LobeHub

  - Visit the `Settings` page in LobeHub
  - Under `AI Service Provider`, find the `Vercel AI Gateway` section
  - Enter the API key you obtained and enable the provider

  <Callout type={'warning'}>
    You may incur charges from the API service provider during usage. Please refer to the Vercel AI Gateway pricing policy for details.
  </Callout>
</Steps>
That's it! You can now start chatting in LobeHub using models provided by Vercel AI Gateway.
Vercel AI Gateway supports a variety of model providers, including:
- **OpenAI**: `openai/gpt-4o`, `openai/gpt-4o-mini`, `openai/o1`, and more
- **Anthropic**: `anthropic/claude-3-5-sonnet`, `anthropic/claude-3-opus`, and more
- **Google**: `google/gemini-2.5-pro`, `google/gemini-2.0-flash`, and more
- **DeepSeek**: `deepseek/deepseek-chat`, `deepseek/deepseek-reasoner`, and more

To view the full list of supported models, visit the Vercel AI Gateway Models page.
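Because the gateway exposes an OpenAI-compatible API, the model catalogue can also be fetched programmatically. The sketch below assumes a standard OpenAI-style `/v1/models` listing endpoint and a hypothetical `AI_GATEWAY_API_KEY` environment variable; it only performs the network call when that variable is set.

```python
import json
import os
import urllib.request

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible model listing endpoint."""
    return urllib.request.Request(
        "https://ai-gateway.vercel.sh/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# AI_GATEWAY_API_KEY is an illustrative variable name, not an official one.
api_key = os.environ.get("AI_GATEWAY_API_KEY")
if api_key:
    with urllib.request.urlopen(build_models_request(api_key)) as resp:
        for model in json.load(resp)["data"]:
            print(model["id"])  # ids use the provider/model format, e.g. "openai/gpt-4o"
```

Note the `provider/model` naming convention in the ids, which is how you reference a specific model through the gateway.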
Vercel AI Gateway uses an OpenAI-compatible API format. The base URL is:
`https://ai-gateway.vercel.sh/v1`
You can use any OpenAI-compatible client with this endpoint and your API key.
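As a concrete illustration, the snippet below builds a chat-completion request against this endpoint using only the Python standard library. The model id `openai/gpt-4o-mini` and the `AI_GATEWAY_API_KEY` environment variable are illustrative assumptions, and the request is only sent when that variable is set.

```python
import json
import os
import urllib.request

BASE_URL = "https://ai-gateway.vercel.sh/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completion request for the gateway."""
    payload = {
        "model": model,  # provider/model format, e.g. "openai/gpt-4o-mini"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# AI_GATEWAY_API_KEY is an illustrative variable name; only send when configured.
api_key = os.environ.get("AI_GATEWAY_API_KEY")
if api_key:
    req = build_chat_request(api_key, "openai/gpt-4o-mini", "Hello!")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Any OpenAI SDK can be pointed at the same endpoint by overriding its base URL, so existing OpenAI-based tooling works without code changes beyond configuration.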