The object parameters in the model are abnormal (arrays and objects); if they are empty, try passing an empty array or empty object instead.

This is the length limit of the indexing model. It is the same regardless of deployment method, but each indexing model has its own configuration, and the parameters can be modified in the admin panel.
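As an illustration (the helper and schema below are hypothetical, not part of FastGPT), tool-call arguments whose schema declares an array or object type can be normalized to an empty array or empty object when the model returns nothing:

```python
import json

# Hypothetical helper: replace missing or null array/object tool-call
# parameters with an empty array or empty object, matching the declared
# JSON type, so the call does not fail on abnormal parameters.
def normalize_params(params, schema_types):
    out = dict(params)
    for name, json_type in schema_types.items():
        if out.get(name) in (None, ""):
            if json_type == "array":
                out[name] = []   # empty array instead of null/missing
            elif json_type == "object":
                out[name] = {}   # empty object instead of null/missing
    return out

# Example schema: one array parameter, one object parameter (both made up)
schema = {"tags": "array", "metadata": "object"}
print(json.dumps(normalize_params({"tags": None}, schema), sort_keys=True))
```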
Mount the verification file to the specified location: /app/projects/app/public/xxxx.txt
Then restart. For example:
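A minimal docker-compose sketch of the mount (the service name `fastgpt` and the host-side file name are placeholders; adapt them to your own compose file):

```yaml
services:
  fastgpt:
    volumes:
      # host path : container path required for the verification file
      - ./xxxx.txt:/app/projects/app/public/xxxx.txt
```

After editing the compose file, restart with `docker compose up -d` so the mount takes effect.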
If the port conflicts, change the host-side port in the mapping to 3307 or similar, e.g. 3307:3306.
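In docker-compose this looks like the fragment below (assuming the conflicting service is MySQL on its default port 3306; the service name is a placeholder):

```yaml
services:
  mysql:
    ports:
      # host port 3307 maps to port 3306 inside the container,
      # avoiding a conflict with anything already using 3306 on the host
      - "3307:3306"
```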
Yes. You need to prepare the vector model and LLM model.
Set toolChoice=false and functionCall=false, and the system will fall back to prompt mode. Currently the built-in prompts have only been tested against commercial model APIs: question classification is basically usable, but content extraction does not work well. The classification prompt can be customized via customCQPrompt.

URI malformed: please open an Issue describing the specific operations and pages; this error is caused by a failure to parse specially encoded strings.

The page uses stream=true mode, so when testing the API you also need to set stream=true. Some model interfaces (mostly domestic ones) have poor compatibility with non-stream requests. Same as the previous question: test with curl.
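A sketch of such a curl test against an OpenAI-compatible endpoint (the base URL, API key, and model name are placeholders — substitute your own):

```
curl https://your-api-host/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "model": "your-model-name",
    "stream": true,
    "messages": [{"role": "user", "content": "hello"}]
  }'
```

With "stream": true a compatible endpoint returns incremental `data:` chunks; if the response only arrives as one final JSON body, the interface's stream support is the problem.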
First check the error information in the logs. There are several possibilities:

- Network exception: servers in mainland China cannot reach OpenAI; check whether the connection to the AI model is working.
- FastGPT cannot reach OneAPI (they are not on the same network).
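As a hypothetical quick check (the helper, hostname, and port below are placeholders), you can test from inside the FastGPT container whether a TCP connection to OneAPI or the model endpoint succeeds at all:

```python
import socket

# Hypothetical helper: return True if a TCP connection to host:port
# succeeds within the timeout, False otherwise. Running it from inside
# the FastGPT container shows whether OneAPI (or the model API) is
# reachable on the same network.
def can_reach(host, port, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. can_reach("oneapi", 3000) inside the docker network
print(can_reach("localhost", 1))
```

If this returns False for the OneAPI address, the two containers are likely not on the same docker network.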