import { Callout, Steps } from 'nextra/components'
import Image from 'next/image'
Running a local language model with your app can significantly enhance its capabilities, letting it understand and generate text without relying on external APIs. By integrating local language models, you can improve privacy, reduce latency, and keep your application working even when offline. This guide covers how to set up different language models for your application.
DocsGPT can automatically switch to a fallback LLM when the primary model fails, including mid-stream. This works with both streaming and non-streaming requests.
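The mid-stream behavior described above can be sketched as a generator that switches sources when the primary stream raises partway through. This is a conceptual illustration only, not DocsGPT's actual code; the `stream_primary` and `stream_fallback` generators are hypothetical stand-ins for real provider clients.

```python
# Conceptual sketch of mid-stream fallback. The generators below are
# hypothetical stand-ins for real streaming LLM clients.

def stream_primary():
    """Primary model: emits one chunk, then drops the connection."""
    yield "Hel"
    raise RuntimeError("provider dropped the connection")

def stream_fallback(already_sent):
    """Fallback model. A real implementation would re-prompt the
    fallback and reconcile text already sent to the client."""
    yield "lo from the fallback model"

def stream_with_fallback():
    sent = []
    try:
        for chunk in stream_primary():
            sent.append(chunk)   # track what the client has seen
            yield chunk
    except Exception:
        # Primary failed mid-stream: continue from the fallback.
        yield from stream_fallback(sent)

print("".join(stream_with_fallback()))
```

The key point is that the consumer sees a single uninterrupted stream; the switch to the fallback model happens behind the generator boundary.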
Fallback order: the primary model is tried first; if it fails, DocsGPT retries the request with the global fallback LLM (configured via the `FALLBACK_LLM_*` env vars below).

| Setting | Description | Default |
|---|---|---|
| `FALLBACK_LLM_PROVIDER` | Provider name (e.g., `openai`, `anthropic`, `google`) | — |
| `FALLBACK_LLM_NAME` | Model name (e.g., `gpt-4o`, `claude-sonnet-4-20250514`) | — |
| `FALLBACK_LLM_API_KEY` | API key for the fallback provider | Falls back to `API_KEY` |
All three (`FALLBACK_LLM_PROVIDER`, `FALLBACK_LLM_NAME`, and an API key) must resolve for the global fallback to activate.
```bash
FALLBACK_LLM_PROVIDER=anthropic
FALLBACK_LLM_NAME=claude-sonnet-4-20250514
FALLBACK_LLM_API_KEY=sk-ant-your-anthropic-key
```
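The resolution rule above (all three settings must resolve, with the API key falling back to `API_KEY`) can be sketched as follows. This is a minimal illustration of the documented behavior, not DocsGPT's internals; `call_llm` and the provider names are hypothetical.

```python
# Sketch of global-fallback resolution, mirroring the settings table
# above. `call_llm` is a hypothetical stand-in for a real LLM client.

def call_llm(provider, model, api_key, prompt):
    if provider == "broken":
        raise RuntimeError("primary provider is down")
    return f"[{provider}/{model}] answer to: {prompt}"

def complete_with_fallback(prompt, primary, env):
    """Try the primary LLM; on failure, retry once with the global
    fallback if (and only if) all three settings resolve."""
    try:
        return call_llm(*primary, prompt)
    except Exception:
        provider = env.get("FALLBACK_LLM_PROVIDER")
        name = env.get("FALLBACK_LLM_NAME")
        # The fallback key defaults to the global API_KEY, per the table.
        key = env.get("FALLBACK_LLM_API_KEY") or env.get("API_KEY")
        if not (provider and name and key):
            raise  # fallback not fully configured: surface the original error
        return call_llm(provider, name, key, prompt)

env = {
    "FALLBACK_LLM_PROVIDER": "anthropic",
    "FALLBACK_LLM_NAME": "claude-sonnet-4-20250514",
    "FALLBACK_LLM_API_KEY": "sk-ant-example",
}
print(complete_with_fallback("hello", ("broken", "gpt-4o", "sk-x"), env))
```

If any of the three settings is missing, the sketch re-raises the primary model's error rather than attempting a half-configured fallback, which matches the activation rule stated above.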