
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/vercel-ai-gateway.ipynb" target="_parent">Open In Colab</a>

Vercel AI Gateway

The AI Gateway is a proxy service from Vercel that routes model requests to various AI providers. It offers a unified API across providers and lets you set budgets, monitor usage, load-balance requests, and manage fallbacks. You can find out more in the Vercel AI Gateway docs.
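To make "unified API" concrete, here is a sketch of the request shape such a gateway accepts. The base URL, the `/chat/completions` path, and the OpenAI-compatible payload layout are assumptions for illustration, not taken from this page:

```python
import json

# Assumed default endpoint for the gateway (check the Vercel docs)
BASE_URL = "https://ai-gateway.vercel.sh/v1"

# One payload shape works for every provider; only the model id changes
payload = {
    "model": "anthropic/claude-4-sonnet",  # provider/model identifier
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 64000,
}
headers = {
    "Authorization": "Bearer your-api-key",
    "Content-Type": "application/json",
}

body = json.dumps(payload)
print("POST", BASE_URL + "/chat/completions")  # path is an assumption
```

Swapping `anthropic/claude-4-sonnet` for, say, an OpenAI model id would leave the rest of the request untouched; that is the point of the unified API.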

If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-vercel-ai-gateway
```

```python
!pip install llama-index
```
```python
from llama_index.llms.vercel_ai_gateway import VercelAIGateway
from llama_index.core.llms import ChatMessage

llm = VercelAIGateway(
    model="anthropic/claude-4-sonnet",
    max_tokens=64000,
    context_window=200000,
    api_key="your-api-key",
    default_headers={
        "http-referer": "https://myapp.com/",  # Optional: Your app URL
        "x-title": "My App",  # Optional: Your app name
    },
)

print(llm.model)
```

Call chat with ChatMessage List

You need to set either the `VERCEL_AI_GATEWAY_API_KEY` or `VERCEL_OIDC_TOKEN` environment variable, or pass `api_key` to the class constructor.
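The precedence among these three options can be sketched with a toy helper. This is not the library's actual implementation, and the exact lookup order (explicit argument first, then the two environment variables) is an assumption:

```python
import os


def resolve_api_key(api_key=None, env=os.environ):
    """Hypothetical lookup order: explicit argument, then env vars."""
    if api_key:
        return api_key
    return env.get("VERCEL_AI_GATEWAY_API_KEY") or env.get("VERCEL_OIDC_TOKEN")


# An explicit api_key argument wins over anything in the environment
print(resolve_api_key("abc", {"VERCEL_AI_GATEWAY_API_KEY": "env-key"}))  # abc
```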

```python
# import os
# os.environ['VERCEL_AI_GATEWAY_API_KEY'] = '<your-api-key>'

llm = VercelAIGateway(
    api_key="pBiuCWfswZCDxt8D50DSoBfU",
    max_tokens=64000,
    context_window=200000,
    model="anthropic/claude-4-sonnet",
)
```
```python
message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)
```

Streaming

```python
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")
```

Call complete with Prompt

```python
resp = llm.complete("Tell me a joke")
print(resp)
```

```python
resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")
```

Model Configuration

```python
# This example uses Anthropic's Claude 4 Sonnet (models are specified as `provider/model`):
llm = VercelAIGateway(
    model="anthropic/claude-4-sonnet",
    api_key="pBiuCWfswZCDxt8D50DSoBfU",
)
```
```python
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)
```
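As noted above, model identifiers follow the `provider/model` convention. A small validation helper (hypothetical, not part of the library) makes the expected format explicit and fails early on a malformed id:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a 'provider/model' identifier; raise if the format is wrong."""
    provider, sep, model = model_id.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model


print(split_model_id("anthropic/claude-4-sonnet"))  # ('anthropic', 'claude-4-sonnet')
```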