docs/llm-providers/overview.mdx
Strix uses LiteLLM for model compatibility, supporting 100+ LLM providers.
Popular models and their LiteLLM identifiers:

| Model | Provider | Configuration |
|---|---|---|
| GPT-5.4 | OpenAI | `openai/gpt-5.4` |
| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
| Gemini 3 Pro | Google Vertex | `vertex_ai/gemini-3-pro-preview` |

Set your model and API key:

```bash
export STRIX_LLM="openai/gpt-5.4"
export LLM_API_KEY="your-api-key"
```
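Since a typo in either variable only surfaces once Strix tries to call the model, it can help to fail early. A minimal sketch (not part of Strix itself) using standard POSIX parameter checks:

```shell
export STRIX_LLM="openai/gpt-5.4"
export LLM_API_KEY="your-api-key"

# Abort with a message if either variable is unset or empty
: "${STRIX_LLM:?STRIX_LLM is not set}"
: "${LLM_API_KEY:?LLM_API_KEY is not set}"

echo "Using model: $STRIX_LLM"
# → Using model: openai/gpt-5.4
```

The `: "${VAR:?message}"` idiom is plain POSIX shell, so it works in any script that wraps your Strix invocation.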
Run models locally with Ollama, LM Studio, or any other OpenAI-compatible server:

```bash
export STRIX_LLM="ollama/llama4"
export LLM_API_BASE="http://localhost:11434"
```
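Before pointing Strix at a local server, you can confirm something is actually listening at the base URL. A minimal sketch, assuming a default Ollama install (Ollama's `/api/tags` endpoint lists locally pulled models; other OpenAI-compatible servers expose different paths):

```shell
export STRIX_LLM="ollama/llama4"
export LLM_API_BASE="http://localhost:11434"

# Probe the server; -s silences progress, -f makes curl fail on HTTP errors
if curl -sf "$LLM_API_BASE/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable at $LLM_API_BASE"
else
  echo "No server responding at $LLM_API_BASE"
fi
```

If the probe fails, start the server (e.g. `ollama serve`) and make sure the model named in `STRIX_LLM` has been pulled.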
See the Local Models guide for setup instructions and recommended models.
Model identifiers use LiteLLM's `provider/model-name` format:

```text
openai/gpt-5.4
anthropic/claude-sonnet-4-6
vertex_ai/gemini-3-pro-preview
bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
ollama/llama4
```
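The segment before the first slash selects the provider; everything after it is passed through as the provider's model name (which may itself contain dots, dashes, or colons, as in the Bedrock example). A quick illustration of how the identifier decomposes, using standard shell parameter expansion:

```shell
STRIX_LLM="bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0"

provider="${STRIX_LLM%%/*}"   # text before the first slash
model="${STRIX_LLM#*/}"       # everything after the first slash

echo "provider=$provider"
# → provider=bedrock
echo "model=$model"
# → model=anthropic.claude-4-5-sonnet-20251022-v1:0
```

Only the provider prefix is interpreted by LiteLLM's routing; consult each provider's LiteLLM page for the exact model names it accepts.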