# LLM API

The LLM API provides functionality for managing LLM backends and adapters. Through these endpoints you can register, configure, and manage different large language model services. (Defined in `kirara_ai/web/api/llm`.)
## GET /backend-api/api/llm/types

Get all available LLM adapter types.

Response example:

```json
{
  "types": [
    "openai",
    "anthropic",
    "azure",
    "local"
  ]
}
```
## GET /backend-api/api/llm/backends

Get information about all registered LLM backends.

Response example:

```json
{
  "data": {
    "backends": [
      {
        "name": "openai",
        "adapter": "openai",
        "config": {
          "api_key": "sk-xxx",
          "api_base": "https://api.openai.com/v1"
        },
        "enable": true,
        "models": ["gpt-4", "gpt-3.5-turbo"]
      }
    ]
  }
}
```
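As a sketch of consuming this response, a small helper (hypothetical, not part of the API) can extract the enabled backends and their models:

```python
def enabled_models(payload: dict) -> dict:
    """Map each enabled backend's name to its list of models.

    `payload` is the parsed JSON body of GET /backend-api/api/llm/backends.
    """
    return {
        b["name"]: b["models"]
        for b in payload["data"]["backends"]
        if b["enable"]
    }

# Sample payload shaped like the response example above.
sample = {
    "data": {
        "backends": [
            {
                "name": "openai",
                "adapter": "openai",
                "config": {"api_key": "sk-xxx"},
                "enable": True,
                "models": ["gpt-4", "gpt-3.5-turbo"],
            }
        ]
    }
}
# enabled_models(sample) -> {"openai": ["gpt-4", "gpt-3.5-turbo"]}
```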
## GET /backend-api/api/llm/backends/{backend_name}

Get detailed information about the specified backend.

Response example:

```json
{
  "data": {
    "name": "anthropic",
    "adapter": "anthropic",
    "config": {
      "api_key": "sk-xxx",
      "api_base": "https://api.anthropic.com"
    },
    "enable": true,
    "models": ["claude-3-opus", "claude-3-sonnet"]
  }
}
```
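No usage example is given for this endpoint in the document; the following is a minimal standard-library sketch, where the helper names, base URL, and token handling are assumptions:

```python
import json
import urllib.request

def extract_backend(payload: dict) -> dict:
    """Pull the backend object out of the response envelope."""
    return payload["data"]

def get_backend(base_url: str, token: str, name: str) -> dict:
    """Fetch GET /backend-api/api/llm/backends/{name} and return the backend object."""
    req = urllib.request.Request(
        f"{base_url}/backend-api/api/llm/backends/{name}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_backend(json.loads(resp.read()))
```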
## POST /backend-api/api/llm/backends

Register a new LLM backend.

Request body:

```json
{
  "name": "anthropic",
  "adapter": "anthropic",
  "config": {
    "api_key": "sk-xxx",
    "api_base": "https://api.anthropic.com"
  },
  "enable": true,
  "models": ["claude-3-opus", "claude-3-sonnet"]
}
```
## PUT /backend-api/api/llm/backends/{backend_name}

Update the configuration of an existing backend.

Request body:

```json
{
  "name": "anthropic",
  "adapter": "anthropic",
  "config": {
    "api_key": "sk-xxx",
    "api_base": "https://api.anthropic.com",
    "temperature": 0.7
  },
  "enable": true,
  "models": ["claude-3-opus", "claude-3-sonnet"]
}
```
## DELETE /backend-api/api/llm/backends/{backend_name}

Delete the specified backend. If the backend is currently enabled, it is automatically unloaded first.
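The document has no usage example for deletion; below is a minimal standard-library sketch. The base URL and helper names are assumptions:

```python
import json
import urllib.request

# Assumed base URL, matching the usage examples later in this document.
BASE_URL = "http://localhost:8080"

def backend_url(name: str) -> str:
    """Build the URL for a named backend."""
    return f"{BASE_URL}/backend-api/api/llm/backends/{name}"

def delete_backend(token: str, name: str) -> dict:
    """Send DELETE for the named backend.

    Per the API description, an enabled backend is unloaded server-side
    before deletion; no extra client step is needed.
    """
    req = urllib.request.Request(
        backend_url(name),
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        return json.loads(body) if body else {}
```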
## GET /backend-api/api/llm/types/{adapter_type}/config-schema

Get the configuration field schema for the specified adapter type.

Response example:

```json
{
  "schema": {
    "title": "OpenAIConfig",
    "type": "object",
    "properties": {
      "api_key": {
        "title": "API Key",
        "type": "string",
        "description": "OpenAI API key"
      },
      "api_base": {
        "title": "API Base",
        "type": "string",
        "description": "API base URL",
        "default": "https://api.openai.com/v1"
      },
      "temperature": {
        "title": "Temperature",
        "type": "number",
        "description": "Generation temperature",
        "default": 0.7,
        "minimum": 0,
        "maximum": 2
      }
    },
    "required": ["api_key"]
  }
}
```
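One client-side use of this schema, sketched here without a JSON Schema library, is a pre-flight check for required fields (a real validator would also enforce types, `minimum`/`maximum`, and defaults):

```python
def missing_required(schema: dict, config: dict) -> list:
    """List the schema's required fields that are absent from config."""
    return [f for f in schema.get("required", []) if f not in config]

# Using the "required" portion of the schema shown above.
openai_schema = {"required": ["api_key"]}
missing_required(openai_schema, {"api_base": "https://api.openai.com/v1"})
# -> ["api_key"]
missing_required(openai_schema, {"api_key": "sk-xxx"})
# -> []
```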
## Field reference

Backend object (used in request and response bodies):

- `name`: backend name
- `adapter`: adapter type
- `config`: configuration (dictionary)
- `enable`: whether the backend is enabled
- `models`: list of supported models

Response fields:

- `backends`: list of LLM backends
- `data`: backend info or backend list (optional)
- `types`: list of available adapter types
- `schema`: configuration field description in JSON Schema format
- `error`: error message (optional)

Adapters are provided by plugins; see the adapter implementation.
The built-in adapter types currently include:

- `openai`: `api_key` (API key), `api_base` (API base URL), `temperature` (temperature, optional)
- `anthropic`: `api_key` (API key), `api_base` (API base URL), `temperature` (temperature, optional)
- `azure`: `api_key` (API key), `api_base` (Azure endpoint), `deployment_name` (deployment name)
- `local`: `model_path` (model path), `device` (device to run on, cpu/cuda)

## Error handling

When an error occurs, every API endpoint returns an appropriate HTTP status code together with an error message:

```json
{
  "error": "error description"
}
```
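A client can fold this error shape into exception handling with a small helper (the exception name and helper are assumptions, not part of the API):

```python
class LLMAPIError(Exception):
    """Raised when an API response carries an error payload."""

def unwrap(payload: dict):
    """Raise LLMAPIError if the payload has an error; otherwise return its data."""
    if payload.get("error"):
        raise LLMAPIError(payload["error"])
    return payload.get("data", payload)

# unwrap({"data": {"name": "openai"}}) -> {"name": "openai"}
# unwrap({"error": "backend not found"}) raises LLMAPIError
```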
Common status codes:

## Usage examples

Get all adapter types:

```python
import requests

response = requests.get(
    'http://localhost:8080/api/llm/types',
    headers={'Authorization': f'Bearer {token}'}
)
```
Register a new backend:

```python
import requests

backend_data = {
    "name": "anthropic",
    "adapter": "anthropic",
    "config": {
        "api_key": "sk-xxx",
        "api_base": "https://api.anthropic.com"
    },
    "enable": True,
    "models": ["claude-3-opus", "claude-3-sonnet"]
}

response = requests.post(
    'http://localhost:8080/api/llm/backends',
    headers={'Authorization': f'Bearer {token}'},
    json=backend_data
)
```
Update a backend's configuration:

```python
import requests

backend_data = {
    "name": "anthropic",
    "adapter": "anthropic",
    "config": {
        "api_key": "sk-xxx",
        "api_base": "https://api.anthropic.com",
        "temperature": 0.7
    },
    "enable": True,
    "models": ["claude-3-opus", "claude-3-sonnet"]
}

response = requests.put(
    'http://localhost:8080/api/llm/backends/anthropic',
    headers={'Authorization': f'Bearer {token}'},
    json=backend_data
)
```