# API Key Authentication

LangBot now supports API key authentication, allowing external systems to access its HTTP service API.

API keys can be managed through the web interface.
## Authentication Methods

Include your API key in the request headers using one of these methods:

**Method 1: `X-API-Key` header (recommended)**

```
X-API-Key: lbk_your_api_key_here
```

**Method 2: `Authorization` Bearer token**

```
Authorization: Bearer lbk_your_api_key_here
```
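In a client, the two methods differ only in which header carries the key. A minimal sketch in Python (the key value is a placeholder):

```python
# Build request headers for each authentication method.
# The key below is a placeholder; use a real key from the web interface.
API_KEY = "lbk_your_api_key_here"

# Method 1: X-API-Key header (recommended)
headers_api_key = {"X-API-Key": API_KEY}

# Method 2: Authorization Bearer token
headers_bearer = {"Authorization": f"Bearer {API_KEY}"}
```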
## Supported Endpoints

All existing LangBot APIs now support both user token and API key authentication. This means you can use API keys to access:

- `/api/v1/provider/models/llm` and `/api/v1/provider/models/embedding`
- `/api/v1/platform/bots`
- `/api/v1/pipelines`
- `/api/v1/knowledge/*`
- `/api/v1/mcp/servers`

Each endpoint accepts either:

- A user token (`Authorization: Bearer <user_jwt_token>`) - for the web UI and authenticated users
- An API key (`X-API-Key` or `Authorization: Bearer <api_key>`) - for external services

## Endpoint Examples

### List LLM models

```
GET /api/v1/provider/models/llm
X-API-Key: lbk_your_api_key_here
```
Response:

```
{
  "code": 0,
  "msg": "ok",
  "data": {
    "models": [
      {
        "uuid": "model-uuid",
        "name": "GPT-4",
        "description": "OpenAI GPT-4 model",
        "requester": "openai-chat-completions",
        "requester_config": {...},
        "abilities": ["chat", "vision"],
        "created_at": "2024-01-01T00:00:00",
        "updated_at": "2024-01-01T00:00:00"
      }
    ]
  }
}
```
### Create an LLM model

```
POST /api/v1/provider/models/llm
X-API-Key: lbk_your_api_key_here
Content-Type: application/json

{
  "name": "My Custom Model",
  "description": "Description of the model",
  "requester": "openai-chat-completions",
  "requester_config": {
    "model": "gpt-4",
    "args": {}
  },
  "api_keys": [
    {
      "name": "default",
      "keys": ["sk-..."]
    }
  ],
  "abilities": ["chat"],
  "extra_args": {}
}
```
### Update an LLM model

```
PUT /api/v1/provider/models/llm/{model_uuid}
X-API-Key: lbk_your_api_key_here
Content-Type: application/json

{
  "name": "Updated Model Name",
  "description": "Updated description",
  ...
}
```
### Delete an LLM model

```
DELETE /api/v1/provider/models/llm/{model_uuid}
X-API-Key: lbk_your_api_key_here
```
### List bots

```
GET /api/v1/platform/bots
X-API-Key: lbk_your_api_key_here
```

### Create a bot

```
POST /api/v1/platform/bots
X-API-Key: lbk_your_api_key_here
Content-Type: application/json

{
  "name": "My Bot",
  "adapter": "telegram",
  "config": {...}
}
```
### List pipelines

```
GET /api/v1/pipelines
X-API-Key: lbk_your_api_key_here
```

### Create a pipeline

```
POST /api/v1/pipelines
X-API-Key: lbk_your_api_key_here
Content-Type: application/json

{
  "name": "My Pipeline",
  "config": {...}
}
```
## Error Responses

**Authentication failure:**

```
{
  "code": -1,
  "msg": "No valid authentication provided (user token or API key required)"
}
```

or

```
{
  "code": -1,
  "msg": "Invalid API key"
}
```

**Resource not found:**

```
{
  "code": -1,
  "msg": "Resource not found"
}
```

**Other errors:**

```
{
  "code": -2,
  "msg": "Error message details"
}
```
## Client Examples

### Python

The examples below use the `X-API-Key` header for clarity.

```python
import requests

API_KEY = "lbk_your_api_key_here"
BASE_URL = "http://your-langbot-server:5300"

headers = {
    "X-API-Key": API_KEY,
    "Content-Type": "application/json",
}

# List all models
response = requests.get(f"{BASE_URL}/api/v1/provider/models/llm", headers=headers)
models = response.json()["data"]["models"]
print(f"Found {len(models)} models")
for model in models:
    print(f"- {model['name']}: {model['description']}")

# Create a new bot
bot_data = {
    "name": "My Telegram Bot",
    "adapter": "telegram",
    "config": {
        "token": "your-telegram-token"
    },
}
response = requests.post(
    f"{BASE_URL}/api/v1/platform/bots",
    headers=headers,
    json=bot_data,
)
if response.status_code == 200:
    bot_uuid = response.json()["data"]["uuid"]
    print(f"Bot created with UUID: {bot_uuid}")
```
### curl

```bash
# List all models
curl -X GET \
  -H "X-API-Key: lbk_your_api_key_here" \
  http://your-langbot-server:5300/api/v1/provider/models/llm

# Create a new pipeline
curl -X POST \
  -H "X-API-Key: lbk_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Pipeline",
    "config": {...}
  }' \
  http://your-langbot-server:5300/api/v1/pipelines

# Get bot logs
curl -X POST \
  -H "X-API-Key: lbk_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "from_index": -1,
    "max_count": 10
  }' \
  http://your-langbot-server:5300/api/v1/platform/bots/{bot_uuid}/logs
```
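The log request above can also be built in Python with only the standard library, for environments without `requests`. A sketch in which `BOT_UUID` and the server address are placeholders:

```python
import json
import urllib.request

API_KEY = "lbk_your_api_key_here"
BASE_URL = "http://your-langbot-server:5300"
BOT_UUID = "your-bot-uuid"  # placeholder: UUID of an existing bot

# Build the same bot-logs request as the curl example.
body = json.dumps({"from_index": -1, "max_count": 10}).encode()
req = urllib.request.Request(
    f"{BASE_URL}/api/v1/platform/bots/{BOT_UUID}/logs",
    data=body,
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# with urllib.request.urlopen(req) as resp:  # uncomment against a live server
#     print(json.load(resp))
```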