# OpenAI Compatible
Cline supports a wide range of AI model providers that offer APIs compatible with the OpenAI API standard. This lets you use models from providers other than OpenAI, such as Together AI and Anyscale, while still working with a familiar API interface.
This document focuses on setting up providers other than the official OpenAI API (which has its own dedicated configuration page).
The key to using an OpenAI-compatible provider with Cline is to configure these main settings:

- **Base URL:** your provider's API endpoint. Do not use `https://api.openai.com/v1` unless you intend to reach OpenAI itself (that URL is for the official OpenAI API).
- **API Key:** the key issued by your provider.
- **Model ID:** the model name your provider expects.

You'll find these settings in the Cline settings panel (click the ⚙️ icon).
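To see how these three settings fit together, here is a minimal sketch of the chat completion request an OpenAI-compatible client assembles from them. The base URL, API key, and model ID shown are placeholder examples, not values Cline ships with; substitute your own provider's settings.

```python
import json

# Hypothetical values -- replace with your provider's actual settings.
BASE_URL = "https://api.together.xyz/v1"      # Base URL (example: Together AI)
API_KEY = "sk-..."                            # API Key from your provider
MODEL_ID = "meta-llama/Llama-3-70b-chat-hf"   # Model ID your provider expects


def build_chat_request(prompt: str) -> tuple[str, dict, dict]:
    """Assemble an OpenAI-style chat completion request from the three settings."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body


url, headers, body = build_chat_request("Hello!")
print(url)
print(json.dumps(body))
```

Because every OpenAI-compatible provider accepts this same request shape, switching providers only means changing the three settings at the top, which is exactly what the Cline settings panel captures.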
While the "OpenAI Compatible" provider type allows connecting to various endpoints, if you are connecting directly to the official OpenAI API (or an endpoint that mirrors it exactly), Cline recognizes the following model IDs, based on the `openAiNativeModels` definition in its source code:
- `o3-mini`
- `o3-mini-high`
- `o3-mini-low`
- `o1`
- `o1-preview`
- `o1-mini`
- `gpt-4o`
- `gpt-4o-mini`

**Note:** If you are using a different OpenAI-compatible provider (such as Together AI, Anyscale, etc.), the available model IDs will differ. Always refer to your specific provider's documentation for their supported model names and any unique configuration details.
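Most OpenAI-compatible APIs also expose a `GET /models` endpoint you can query to discover the exact model IDs your provider accepts. The sketch below builds such a request with the standard library; the base URL and key are placeholders for your provider's values, and the request is constructed but not sent.

```python
import urllib.request

# Hypothetical settings -- substitute your provider's values.
BASE_URL = "https://api.example-provider.com/v1"
API_KEY = "sk-..."


def build_list_models_request() -> urllib.request.Request:
    """Build a GET /models request; many OpenAI-compatible APIs expose this
    endpoint for listing the model IDs they accept."""
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {API_KEY}"},
        method="GET",
    )


req = build_list_models_request()
print(req.full_url, req.get_method())
```

Sending this request with `urllib.request.urlopen(req)` returns a JSON list of model objects whose `id` fields are the values to paste into Cline's Model ID setting.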
For developers working with Vercel's v0, the Vercel AI SDK documentation provides valuable insights and examples for integrating various models, many of which are OpenAI-compatible. It can be a helpful resource for understanding how to structure calls and manage configurations when using Cline with services deployed on or integrated with Vercel.
v0 can be used in Cline with the OpenAI Compatible provider.
By using an OpenAI-compatible provider, you can leverage the flexibility of Cline with a wider array of AI models. Remember to always consult your provider's documentation for the most accurate and up-to-date information.