docs/changelog/2025-03-02-new-models.mdx
LobeHub completed its largest AI ecosystem expansion this February. The goal is straightforward: give you access to more capable models across more providers without breaking your existing workflow or requiring new setup steps.
This release adds 10+ mainstream providers spanning both international and China-based platforms. You can now connect to a wider range of services directly from your existing LobeHub setup.
DeepSeek R1 is now fully supported, and reasoning-model compatibility has expanded to include Claude 3.7 Sonnet and OpenAI o3-mini. These models display their chain-of-thought in real time, so you can follow how conclusions are reached. DeepSeek R1's reasoning output is now parsed consistently across providers, making it easier to read in daily use.
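To illustrate what consistent reasoning-output parsing involves, here is a minimal sketch based on the convention where DeepSeek R1 wraps its chain-of-thought in `<think>…</think>` tags before the final answer. The function name `splitThink` is illustrative, not part of the LobeHub codebase:

```typescript
// Minimal sketch: split a reasoning model's raw output into its
// chain-of-thought and final answer. Assumes the <think>...</think>
// tag convention; `splitThink` is an illustrative name, not a
// LobeHub API.
export function splitThink(raw: string): { reasoning: string; answer: string } {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    // No reasoning block: treat the whole output as the answer.
    return { reasoning: "", answer: raw.trim() };
  }
  return {
    reasoning: match[1].trim(),
    // Everything after the closing tag is the user-facing answer.
    answer: raw.slice(match.index! + match[0].length).trim(),
  };
}
```

A normalizer like this lets the UI render the reasoning segment and the answer separately, regardless of which provider served the model.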
Online search has been upgraded with SearXNG and Perplexity integration, plus support for deep web crawling. Gemini 2.0 and Qwen series models can now use native search as part of their reasoning process.
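For context on the SearXNG side, a self-hosted instance exposes a JSON search endpoint (`GET /search?q=…&format=json`, when the JSON format is enabled in its settings). A minimal sketch of building such a request URL; the base URL and helper name are illustrative, not LobeHub configuration:

```typescript
// Minimal sketch: construct a query URL for a SearXNG instance's
// JSON API. `buildSearchUrl` is an illustrative helper, not part
// of the LobeHub codebase.
export function buildSearchUrl(baseUrl: string, query: string): string {
  const url = new URL("/search", baseUrl);
  url.searchParams.set("q", query);
  url.searchParams.set("format", "json"); // requires JSON enabled in SearXNG settings
  return url.toString();
}
```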
This release also updates 50+ model configurations to keep your options current.
Huge thanks to these contributors:
@AmAzing129 @hezhijie0327 @arvinxx @lobehub-team