docs/guides/models/llm_api_key_setup.md
An API key is required for RAGFlow to interact with an online AI model. This guide explains how to set your model API key in RAGFlow.
RAGFlow supports most mainstream LLMs. Please refer to Supported Models for a complete list. You will need to apply for your model API key online. Note that most LLM providers grant newly created accounts either trial credit, which expires within a couple of months, or a promotional amount of free quota.
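Once you have obtained a key, you may want to confirm it works before wiring it into RAGFlow. The sketch below builds a minimal request against an OpenAI-compatible chat endpoint using only the Python standard library; the endpoint URL, model name, and placeholder key are assumptions — substitute your provider's actual values.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Construct a minimal chat-completion request carrying the API key."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # The API key travels in the Authorization header as a Bearer token.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical values for illustration -- replace with your provider's.
req = build_chat_request("https://api.openai.com/v1", "sk-your-key", "gpt-4o-mini")
# Sending it with urllib.request.urlopen(req) should succeed for a valid key;
# an HTTP 401 response means the key was rejected.
```

If the request returns an authentication error, double-check the key and, when you are behind a proxy, the base URL you pass to RAGFlow.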
:::note
If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can file a feature request with us! Alternatively, if you have customized or locally-deployed models, you can bind them to RAGFlow using Ollama, Xinference, or LocalAI.
:::
You have two options for configuring your model API key:

- Configure it in the **service_conf.yaml.template** file before starting up RAGFlow.
- Configure it on the **Model providers** page after logging into RAGFlow.

To configure your model API key in **service_conf.yaml.template**, update the following entries before starting up RAGFlow:

- `factory` with your chosen LLM.
- `api_key` with your model API key.
- `base_url` if you use a proxy to connect to the remote service.

:::caution WARNING
After logging into RAGFlow, configuring your model API key through the **service_conf.yaml.template** file will no longer take effect.
:::
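For orientation, the relevant fragment of **service_conf.yaml.template** might look like the sketch below. The `user_default_llm` entry name and the placeholder values are assumptions — check the template file shipped with your RAGFlow version for the exact structure.

```yaml
# Sketch of the model-key entry in service_conf.yaml.template (assumed layout).
user_default_llm:
  factory: "OpenAI"      # your chosen LLM provider
  api_key: "sk-xxxxxxxx" # your model API key
  base_url: ""           # set only if you use a proxy to the remote service
```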
After logging into RAGFlow, you can only configure your model API key on the **Model providers** page: