An LLM gateway is a proxy server that forwards requests to an LLM API and returns the response. It is useful when you want to centralize access to LLM providers, or when you want to query multiple providers through a single endpoint using a consistent request and response format.
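In practice, the "single endpoint, consistent format" idea usually means the gateway exposes an OpenAI-compatible API, so the same request body works against any gateway and only the endpoint URL changes. A minimal sketch of that idea (the URLs and model name below are illustrative placeholders, not endpoints documented on this page):

```python
import json

# An OpenAI-compatible chat completion request body. The same payload can be
# POSTed to a direct provider or to a gateway that implements this format;
# only the endpoint URL (and the API key) changes.
request_body = {
    "model": "gpt-4o-mini",  # the gateway maps this to a configured provider
    "messages": [{"role": "user", "content": "Hello"}],
}

# Placeholder endpoints for illustration only.
endpoints = [
    "https://api.openai.com/v1/chat/completions",  # direct provider
    "http://localhost:4000/v1/chat/completions",   # hypothetical gateway
]

for url in endpoints:
    # In real code you would POST json.dumps(request_body) to each URL.
    print(url, "->", json.dumps(request_body))
```

Because client code is written against this one format, swapping providers (or adding routing, caching, or rate limiting) becomes a gateway configuration change rather than an application change.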
Opik supports several LLM gateway solutions to help you centralize and manage your LLM provider access:
<CardGroup cols={3}>
  <Card title="Opik LLM Gateway" href="/v1/integrations/opik-llm-gateway" />
  <Card title="Kong AI Gateway" href="/v1/integrations/kong-ai-gateway" />
  <Card title="AISuite" href="/v1/integrations/aisuite" />
  <Card title="Anannas AI" href="/v1/integrations/anannas" />
  <Card title="Helicone" href="/v1/integrations/helicone" />
  <Card title="LiteLLM" href="/v1/integrations/litellm" />
  <Card title="OpenRouter" href="/v1/integrations/openrouter" />
  <Card title="Portkey" href="/v1/integrations/portkey" />
  <Card title="TrueFoundry" href="/v1/integrations/truefoundry" />
  <Card title="Vercel AI Gateway" href="/v1/integrations/vercel-ai-gateway" />
</CardGroup>