Overview

Strix uses LiteLLM for model compatibility, supporting 100+ LLM providers.

Configuration

Set your model and API key:

| Model | Provider | Configuration |
|-------|----------|---------------|
| GPT-5.4 | OpenAI | `openai/gpt-5.4` |
| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
| Gemini 3 Pro | Google Vertex | `vertex_ai/gemini-3-pro-preview` |
```bash
export STRIX_LLM="openai/gpt-5.4"
export LLM_API_KEY="your-api-key"
```
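
To use a different provider, swap in the corresponding model string from the table above. For example, for Claude Sonnet 4.6 via Anthropic (a minimal sketch assuming the same `LLM_API_KEY` variable carries the Anthropic key; see the Anthropic guide for specifics):

```bash
# Assumes the same two environment variables work for any provider.
export STRIX_LLM="anthropic/claude-sonnet-4-6"
export LLM_API_KEY="your-anthropic-api-key"
```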

Local Models

Run models locally with Ollama, LM Studio, or any OpenAI-compatible server:

```bash
export STRIX_LLM="ollama/llama4"
export LLM_API_BASE="http://localhost:11434"
```
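
For LM Studio or another OpenAI-compatible server, LiteLLM's `openai/` prefix can be pointed at a custom base URL. The model name, port, and key below are placeholders for whatever your local server actually exposes (LM Studio defaults to port 1234), so treat this as a sketch:

```bash
# Sketch for an OpenAI-compatible local server; adjust model name, port, and key to your setup.
export STRIX_LLM="openai/your-local-model"      # hypothetical name of the model your server serves
export LLM_API_BASE="http://localhost:1234/v1"  # LM Studio's default endpoint
export LLM_API_KEY="not-needed"                 # most local servers accept any placeholder key
```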

See the Local Models guide for setup instructions and recommended models.

Provider Guides

<CardGroup cols={2}>
  <Card title="OpenAI" href="/llm-providers/openai">
    GPT-5.4 models.
  </Card>
  <Card title="Anthropic" href="/llm-providers/anthropic">
    Claude Opus, Sonnet, and Haiku.
  </Card>
  <Card title="OpenRouter" href="/llm-providers/openrouter">
    Access 100+ models through a single API.
  </Card>
  <Card title="Google Vertex AI" href="/llm-providers/vertex">
    Gemini 3 models via Google Cloud.
  </Card>
  <Card title="AWS Bedrock" href="/llm-providers/bedrock">
    Claude and Titan models via AWS.
  </Card>
  <Card title="Azure OpenAI" href="/llm-providers/azure">
    GPT-5.4 via Azure.
  </Card>
  <Card title="Local Models" href="/llm-providers/local">
    Llama 4, Mistral, and self-hosted models.
  </Card>
</CardGroup>

Model Format

Use LiteLLM's provider/model-name format:

```text
openai/gpt-5.4
anthropic/claude-sonnet-4-6
vertex_ai/gemini-3-pro-preview
bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
ollama/llama4
```
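
Whichever entry you pick goes straight into `STRIX_LLM`: the prefix before the first slash selects the LiteLLM provider, and the rest is passed through as the model name. Credentials for the chosen provider are configured separately, as described in the provider guides above. For example:

```bash
# "bedrock/" selects the AWS Bedrock provider in LiteLLM;
# the remainder (including the version suffix) is the Bedrock model ID.
export STRIX_LLM="bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0"
```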