docs/full-guide_EN.md
This document is the complete configuration guide for the AI Stock Analysis System, intended for users who need advanced features or special deployment methods. A quick start guide is available in README_EN.md; this document covers advanced configuration.
daily_stock_analysis/
├── main.py # Main entry point
├── src/ # Core business logic
│ ├── analyzer.py # AI analyzer
│ ├── config.py # Configuration management
│ ├── notification.py # Message push notifications
│ └── ...
├── data_provider/ # Multi-source data adapters
├── bot/ # Bot interaction module
├── api/ # FastAPI backend service
├── apps/dsa-web/ # React frontend
├── docker/ # Docker configuration
├── docs/ # Project documentation
└── .github/workflows/ # GitHub Actions
Click the Fork button in the upper right corner.
Go to your forked repo → Settings → Secrets and variables → Actions → New repository secret
Note: The configuration below documents existing runtime provider support and compatibility boundaries.
| Secret Name | Description | Required |
|---|---|---|
ANSPIRE_API_KEYS | Anspire API key, one key for popular LLMs and Chinese-optimized web search with free quota for this project | Recommended |
AIHUBMIX_KEY | AIHubMix API key, one key for multiple model families and a 10% top-up discount for this project | Recommended |
GEMINI_API_KEY | Get free key from Google AI Studio | Optional |
ANTHROPIC_API_KEY | Anthropic Claude API Key | Optional |
OPENAI_API_KEY | OpenAI-compatible API Key (supports DeepSeek, Qwen, etc.) | Optional |
OPENAI_BASE_URL | OpenAI-compatible API endpoint (e.g., https://api.deepseek.com) | Optional |
OPENAI_MODEL | Model name (e.g., deepseek-v4-flash) | Optional |
Note: Configure at least one model key or channel. Anspire or AIHubMix is the simplest starting point for one-key multi-model access.
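For example, a minimal single-provider setup needs only one secret. The values below are placeholders; pick whichever provider you actually use:

```shell
# Any ONE of these is enough to start (placeholder values).
ANSPIRE_API_KEYS=your_anspire_key      # one key for LLMs + Chinese web search
# AIHUBMIX_KEY=your_aihubmix_key       # one key for multiple model families
# GEMINI_API_KEY=your_gemini_key       # free key from Google AI Studio
```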
| Secret Name | Description | Required |
|---|---|---|
WECHAT_WEBHOOK_URL | WeChat Work Webhook URL | Optional |
FEISHU_WEBHOOK_URL | Feishu Webhook URL | Optional |
FEISHU_WEBHOOK_SECRET | Feishu Webhook signing secret (required when “Signature” security is enabled) | Optional |
FEISHU_WEBHOOK_KEYWORD | Feishu Webhook keyword (required when “Keyword” security is enabled) | Optional |
TELEGRAM_BOT_TOKEN | Telegram Bot Token (get from @BotFather) | Optional |
TELEGRAM_CHAT_ID | Telegram Chat ID | Optional |
TELEGRAM_MESSAGE_THREAD_ID | Telegram Topic ID (for sending to topics) | Optional |
DISCORD_WEBHOOK_URL | Discord Webhook URL (How to create) | Optional |
DISCORD_BOT_TOKEN | Discord Bot Token (choose one with Webhook) | Optional |
DISCORD_MAIN_CHANNEL_ID | Discord Channel ID (required when using Bot) | Optional |
DISCORD_INTERACTIONS_PUBLIC_KEY | Discord Public Key (required only for inbound Interaction/Webhook signature verification) | Optional |
SLACK_BOT_TOKEN | Slack Bot Token (recommended, supports image upload; takes priority over Webhook when both set) | Optional |
SLACK_CHANNEL_ID | Slack Channel ID (required when using Bot) | Optional |
SLACK_WEBHOOK_URL | Slack Incoming Webhook URL (text only, no image support) | Optional |
EMAIL_SENDER | Sender email (e.g., [email protected]) | Optional |
EMAIL_PASSWORD | Email authorization code (not login password) | Optional |
EMAIL_RECEIVERS | Receiver emails (comma-separated, leave empty to send to self) | Optional |
EMAIL_SENDER_NAME | Sender display name | Optional |
STOCK_GROUP_N / EMAIL_GROUP_N | Email routing groups (Issue #268): STOCK_GROUP_N should be a subset of STOCK_LIST; affects email recipients only, not analysis scope or other channels | Optional |
PUSHPLUS_TOKEN | PushPlus Token (Get here, Chinese push service) | Optional |
SERVERCHAN3_SENDKEY | ServerChan v3 Sendkey (Get here, mobile app push service) | Optional |
CUSTOM_WEBHOOK_URLS | Custom Webhook (supports DingTalk, etc., comma-separated) | Optional |
CUSTOM_WEBHOOK_BEARER_TOKEN | Bearer Token for custom webhooks (for authenticated webhooks) | Optional |
CUSTOM_WEBHOOK_BODY_TEMPLATE | Custom Webhook JSON body template for AstrBot, NapCat, or self-hosted services with special payloads | Optional |
WEBHOOK_VERIFY_SSL | Verify Webhook HTTPS certificates (default true). Set to false for self-signed certs. WARNING: Disabling has serious security risk (MITM), use only on trusted internal networks | Optional |
Note: Configure at least one channel; multiple channels will all receive notifications.
The default `daily_analysis.yml` in this repository only exports fixed Secret / Variable names. Arbitrary numbered env vars such as `STOCK_GROUP_1` and `EMAIL_GROUP_1` are not auto-injected into the job, so grouped email routing is not available in the stock workflow unless you explicitly extend the workflow's `env:` mapping in your own fork.
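If you do maintain a fork, exposing a group to the job is a one-line addition per variable in the workflow's `env:` mapping. A sketch (the exact job and step layout in your fork may differ):

```yaml
# .github/workflows/daily_analysis.yml (fork): extend the job-level env block
env:
  STOCK_GROUP_1: ${{ secrets.STOCK_GROUP_1 }}
  EMAIL_GROUP_1: ${{ secrets.EMAIL_GROUP_1 }}
```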
| Secret Name | Description | Required |
|---|---|---|
SINGLE_STOCK_NOTIFY | Single stock push mode: set to true to push immediately after each stock analysis | Optional |
REPORT_TYPE | Report type: simple (concise), full (complete), brief (3-5 sentences), Docker recommended: full | Optional |
REPORT_LANGUAGE | Report output language: zh (default Chinese) / en (English); also updates prompt instructions, templates, notification fallbacks, and fixed copy in the Web report view. The bundled daily_analysis.yml already maps this variable, so setting it in Actions Secrets/Variables works out of the box | Optional |
REPORT_TEMPLATES_DIR | Jinja2 template directory (relative to project root, default templates) | Optional |
REPORT_RENDERER_ENABLED | Enable Jinja2 template rendering (default false, zero regression) | Optional |
REPORT_INTEGRITY_ENABLED | Enable report integrity checks, retry or placeholder on missing fields (default true) | Optional |
REPORT_INTEGRITY_RETRY | Integrity retry count (default 1, 0 = placeholder only) | Optional |
REPORT_HISTORY_COMPARE_N | History signal comparison count, 0 off (default), >0 enable | Optional |
ANALYSIS_DELAY | Delay between stock analysis and market review (seconds) to avoid API rate limits, e.g., 10 | Optional |
| Secret Name | Description | Required |
|---|---|---|
STOCK_LIST | Watchlist codes, e.g., 600519,300750,002594 | ✅ |
ANSPIRE_API_KEYS | Anspire AI Search optimized for Chinese content; the same key can also be used for Anspire LLM fallback scenarios (example model: Doubao-Seed-2.0-lite) | Recommended |
SERPAPI_API_KEYS | SerpAPI search-engine results for realtime financial news | Recommended |
TAVILY_API_KEYS | Tavily Search API (for news search) | Optional |
BOCHA_API_KEYS | Bocha Search Web Search API (Chinese search optimized, supports AI summaries, multiple keys comma-separated) | Optional |
BRAVE_API_KEYS | Brave Search API (privacy-first, US-stock news enrichment, comma-separated for multiple keys) | Optional |
MINIMAX_API_KEYS | MiniMax Coding Plan Web Search (structured search results) | Optional |
SEARXNG_BASE_URLS | SearXNG self-hosted instances (quota-free fallback, enable format: json in settings.yml); when empty the app auto-discovers public instances | Optional |
SEARXNG_PUBLIC_INSTANCES_ENABLED | Auto-discover public SearXNG instances from searx.space when SEARXNG_BASE_URLS is empty (default true) | Optional |
TUSHARE_TOKEN | Tushare Pro Token | Optional |
TICKFLOW_API_KEY | TickFlow API key for CN market review index enhancement; market breadth also uses TickFlow when the plan supports universe queries | Optional |
To get started quickly, you need at minimum:
- `ANSPIRE_API_KEYS` (one key for LLMs and search), `AIHUBMIX_KEY` (one key for multiple model families), `GEMINI_API_KEY`, or `OPENAI_API_KEY`
- `WECHAT_WEBHOOK_URL` or `EMAIL_SENDER` + `EMAIL_PASSWORD`
- `STOCK_LIST` (required)
- `ANSPIRE_API_KEYS` or `SERPAPI_API_KEYS` (recommended for news and sentiment search)

Configure these 4 items and you're ready to go!
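As a concrete sketch, the four items above might look like this as repository secrets (all values are placeholders; here `ANSPIRE_API_KEYS` satisfies both the model and the search requirement):

```shell
STOCK_LIST=600519,300750,002594
ANSPIRE_API_KEYS=your_anspire_key        # covers both LLM and news search
WECHAT_WEBHOOK_URL=https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=your_key
```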
1. Open the Actions tab at the top and click "I understand my workflows, go ahead and enable them".
2. In the Actions tab, select the Daily Stock Analysis workflow on the left.
3. Click the Run workflow button on the right, then Run workflow to confirm.

Default schedule: runs automatically every weekday at 18:00 (Beijing Time).
Full details: LLM Config Guide (three-tier config, channels, Vision, Agent, troubleshooting).
| Variable | Description | Default | Required |
|---|---|---|---|
LITELLM_MODEL | Primary model, format provider/model (e.g. gemini/gemini-3.1-pro-preview), recommended | - | No |
AGENT_LITELLM_MODEL | Optional Agent-only primary model; when empty it inherits the primary model, and bare names are normalized to openai/<model> | - | No |
LITELLM_FALLBACK_MODELS | Fallback models, comma-separated | - | No |
LLM_CHANNELS | Channel names (comma-separated), use with LLM_{NAME}_*, see LLM Config Guide | - | No |
LITELLM_CONFIG | Advanced model routing YAML path (expert use) | - | No |
ANSPIRE_API_KEYS | Anspire API key, one key for the LLM gateway and search | - | Optional |
AIHUBMIX_KEY | AIHubMix API key, one key for multiple model families | - | Optional |
GEMINI_API_KEY | Google Gemini API Key | - | Optional |
GEMINI_MODEL | Primary model name (legacy, LITELLM_MODEL preferred) | gemini-3.1-pro-preview | No |
GEMINI_MODEL_FALLBACK | Fallback model (legacy) | gemini-3-flash-preview | No |
ANTHROPIC_API_KEY | Anthropic Claude API Key | - | Optional |
OPENAI_API_KEY | OpenAI-compatible API Key | - | Optional |
OPENAI_BASE_URL | OpenAI-compatible API endpoint | - | Optional |
OLLAMA_API_BASE | Ollama local service address (e.g. http://localhost:11434), see LLM Config Guide | - | Optional |
OPENAI_MODEL | OpenAI model name (legacy) | gpt-5.5 | Optional |
Note: Configure at least one of `ANSPIRE_API_KEYS`, `AIHUBMIX_KEY`, `GEMINI_API_KEY`, `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `OLLAMA_API_BASE`, or `LLM_CHANNELS`/`LITELLM_CONFIG`. `ANSPIRE_API_KEYS` and `AIHUBMIX_KEY` are auto-adapted without an `OPENAI_BASE_URL`.
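For a fully local setup, a minimal Ollama sketch looks like this (the model name is illustrative; use whatever model you have pulled locally):

```shell
OLLAMA_API_BASE=http://localhost:11434
LITELLM_MODEL=ollama/llama3   # LiteLLM routes the ollama/ prefix to the local server
```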
| Variable | Description | Required |
|---|---|---|
WECHAT_WEBHOOK_URL | WeChat Work Bot Webhook URL | Optional |
FEISHU_WEBHOOK_URL | Feishu Bot Webhook URL | Optional |
FEISHU_WEBHOOK_SECRET | Feishu bot signing secret (only for webhook bots with Signature security enabled) | Optional |
FEISHU_WEBHOOK_KEYWORD | Feishu bot keyword (only for webhook bots with Keyword security enabled) | Optional |
TELEGRAM_BOT_TOKEN | Telegram Bot Token | Optional |
TELEGRAM_CHAT_ID | Telegram Chat ID | Optional |
TELEGRAM_MESSAGE_THREAD_ID | Telegram Topic ID | Optional |
DISCORD_WEBHOOK_URL | Discord Webhook URL | Optional |
DISCORD_BOT_TOKEN | Discord Bot Token (choose one with Webhook) | Optional |
DISCORD_MAIN_CHANNEL_ID | Discord Channel ID (required when using Bot) | Optional |
DISCORD_INTERACTIONS_PUBLIC_KEY | Discord Public Key (required only for inbound Interaction/Webhook signature verification) | Optional |
DISCORD_MAX_WORDS | Discord Word Limit (default 2000 for un-upgraded servers) | Optional |
SLACK_BOT_TOKEN | Slack Bot Token (recommended, supports image upload; takes priority over Webhook when both set) | Optional |
SLACK_CHANNEL_ID | Slack Channel ID (required when using Bot) | Optional |
SLACK_WEBHOOK_URL | Slack Incoming Webhook URL (text only, no image support) | Optional |
EMAIL_SENDER | Sender email | Optional |
EMAIL_PASSWORD | Email authorization code (not login password) | Optional |
EMAIL_RECEIVERS | Receiver emails (comma-separated, leave empty to send to self) | Optional |
EMAIL_SENDER_NAME | Sender display name | Optional |
STOCK_GROUP_N / EMAIL_GROUP_N | Email routing groups (Issue #268): STOCK_GROUP_N should stay within STOCK_LIST and only changes email recipients | Optional |
CUSTOM_WEBHOOK_URLS | Custom Webhook (comma-separated) | Optional |
CUSTOM_WEBHOOK_BEARER_TOKEN | Custom Webhook Bearer Token | Optional |
WEBHOOK_VERIFY_SSL | Webhook HTTPS certificate verification (default true). Set to false for self-signed certs. WARNING: Disabling has serious security risk | Optional |
PUSHOVER_USER_KEY | Pushover User Key | Optional |
PUSHOVER_API_TOKEN | Pushover API Token | Optional |
PUSHPLUS_TOKEN | PushPlus Token (Chinese push service) | Optional |
SERVERCHAN3_SENDKEY | ServerChan v3 Sendkey | Optional |
Note: the default `daily_analysis` GitHub Actions workflow only maps fixed variable names. It does not automatically import arbitrary numbered variables such as `STOCK_GROUP_N`/`EMAIL_GROUP_N`. This feature therefore works in a local `.env`, Docker, or any runtime where you explicitly inject those variables.
| Variable | Description | Required |
|---|---|---|
FEISHU_APP_ID | Feishu App ID | Optional |
FEISHU_APP_SECRET | Feishu App Secret | Optional |
FEISHU_FOLDER_TOKEN | Feishu Cloud Drive Folder Token | Optional |
Feishu Cloud Document setup steps:
- Create an app in Feishu Developer Console
- Configure GitHub Secrets
- Create a group and add the app bot
- Add the group as a collaborator to the cloud drive folder (with manage permissions)
Note: `FEISHU_APP_ID`/`FEISHU_APP_SECRET` are for Feishu app mode, cloud documents, or Stream Bot mode. They do not enable group webhook notifications by themselves. For simple push notifications, use `FEISHU_WEBHOOK_URL` first.
| Variable | Description | Required |
|---|---|---|
ANSPIRE_API_KEYS | Anspire Open API Key (shared with search and LLM fallback examples; availability depends on account/model entitlement, and can effectively enhance A-share analysis) | Recommended |
SERPAPI_API_KEYS | SerpAPI search-engine results for realtime financial news | Recommended |
TAVILY_API_KEYS | Tavily Search API Key | Optional |
BOCHA_API_KEYS | Bocha Search API Key (Chinese optimized) | Optional |
BRAVE_API_KEYS | Brave Search API Key (US stocks optimized) | Optional |
MINIMAX_API_KEYS | MiniMax Coding Plan Web Search (structured results) | Optional |
SOCIAL_SENTIMENT_API_KEY | Stock Sentiment API Key (Reddit / X / Polymarket, US stocks optional) | Optional |
SOCIAL_SENTIMENT_API_URL | Stock Sentiment API endpoint (default https://api.adanos.org) | Optional |
SEARXNG_BASE_URLS | SearXNG self-hosted instances (quota-free fallback, enable format: json in settings.yml); when empty the app auto-discovers public instances | Optional |
SEARXNG_PUBLIC_INSTANCES_ENABLED | Auto-discover public SearXNG instances from searx.space when SEARXNG_BASE_URLS is empty (default true) | Optional |
Behavior note: Search and social sentiment are optional enhancement services. If either service fails to initialize, the system logs a warning and degrades gracefully by skipping that stage without blocking the core analysis flow.
| Variable | Description | Default | Required |
|---|---|---|---|
TUSHARE_TOKEN | Tushare Pro Token | - | Optional |
TICKFLOW_API_KEY | TickFlow API key; CN market review indices prefer TickFlow when configured, and market breadth does so only when the plan supports universe queries | - | Optional |
ENABLE_REALTIME_QUOTE | Enable real-time quotes (if disabled, uses historical closing prices for analysis) | true | Optional |
ENABLE_REALTIME_TECHNICAL_INDICATORS | Intraday real-time technicals: Calculate MA5/MA10/MA20 and bull trends using real-time prices when enabled (Issue #234); uses yesterday's close if disabled. | true | Optional |
ENABLE_CHIP_DISTRIBUTION | Enable chip distribution analysis (this API is unstable, recommended to disable for cloud deployment). GitHub Actions users must set ENABLE_CHIP_DISTRIBUTION=true in Repository Variables to enable; disabled by default in workflows. | true | Optional |
ENABLE_EASTMONEY_PATCH | Eastmoney API patch: Recommended to set to true when Eastmoney APIs fail frequently (e.g., RemoteDisconnected, connection closed). Injects NID tokens and random User-Agents to reduce rate limiting probability. | false | Optional |
REALTIME_SOURCE_PRIORITY | Real-time quote source priority (comma-separated), e.g., tencent,akshare_sina,efinance,akshare_em | See .env.example | Optional |
ENABLE_FUNDAMENTAL_PIPELINE | Master switch for fundamental aggregation; when disabled, returns not_supported block only, without altering the original analysis pipeline. | true | Optional |
FUNDAMENTAL_STAGE_TIMEOUT_SECONDS | Total latency budget for the fundamental stage (seconds) | 1.5 | Optional |
FUNDAMENTAL_FETCH_TIMEOUT_SECONDS | Timeout for a single capability source call (seconds) | 0.8 | Optional |
FUNDAMENTAL_RETRY_MAX | Retry count for fundamental capabilities (including the first attempt) | 1 | Optional |
FUNDAMENTAL_CACHE_TTL_SECONDS | Fundamental aggregation cache TTL (seconds), short cache to reduce repeated API pulling. | 120 | Optional |
FUNDAMENTAL_CACHE_MAX_ENTRIES | Maximum entries for fundamental cache (evicted by time within TTL) | 256 | Optional |
Behavior Notes:
- A-shares: returns aggregated capabilities by `valuation/growth/earnings/institution/capital_flow/dragon_tiger/boards`.
- ETFs: returns available items, marks missing capabilities as `not_supported`, and does not affect the original flow overall.
- US/HK stocks: returns a `not_supported` fallback block.
- Any exception uses fail-open logic: errors are only logged and do not affect the main technical/news/chip pipeline.
- Field contracts:
  - `fundamental_context.belong_boards` = related board list for the stock (currently populated for A-shares only; `[]` when unavailable)
  - `fundamental_context.boards.data` = `sector_rankings` (sector rise/fall leaderboard, structure `{top, bottom}`)
  - `get_stock_info.belong_boards` = list of sectors the individual stock belongs to; `get_stock_info.boards` is a compatibility alias whose value is identical to `belong_boards` (removal considered only in major version updates); `get_stock_info.sector_rankings` stays consistent with `fundamental_context.boards.data`
  - `AnalysisReport.details.belong_boards` = related board list in structured report details; `AnalysisReport.details.sector_rankings` = sector leaderboard in structured report details for board-linkage display
- The sector leaderboard uses a fixed fallback order consistent with the global priority.
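Put together, the field contracts above imply a response shape roughly like this. This is an illustrative sketch only: the sector names are made up, and the per-entry leaderboard fields (`name`, `change_pct`) are assumptions, not a verbatim payload:

```json
{
  "fundamental_context": {
    "belong_boards": ["Sector A", "Sector B"],
    "boards": {
      "data": {
        "top":    [{"name": "Sector A", "change_pct": 2.1}],
        "bottom": [{"name": "Sector Z", "change_pct": -1.8}]
      }
    }
  }
}
```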
- Timeout control is a best-effort soft timeout: the stage degrades quickly and continues execution based on the budget, but does not guarantee a hard interrupt of underlying third-party network calls. `FUNDAMENTAL_STAGE_TIMEOUT_SECONDS=1.5` is the target budget for the newly added fundamental stage, not a strict hard SLA.
- A hard SLA would require isolated child-process execution (planned for future versions) to forcefully terminate timed-out tasks.
| Variable | Description | Default |
|---|---|---|
STOCK_LIST | Watchlist codes (comma-separated) | - |
MAX_WORKERS | Concurrent threads | 3 |
MARKET_REVIEW_ENABLED | Enable market review | true |
MARKET_REVIEW_REGION | Market review region: cn (A-shares), hk (HK stocks), us (US stocks), both (all three markets) | cn |
SCHEDULE_ENABLED | Enable scheduled tasks | false |
SCHEDULE_TIME | Scheduled execution time | 18:00 |
LOG_DIR | Log directory | ./logs |
Behavior notes:
- When `TICKFLOW_API_KEY` is configured, CN market review first tries TickFlow for main indices. Market breadth also tries TickFlow only when the current TickFlow plan supports universe queries.
- TickFlow behavior is capability-based rather than just key-based: limited plans can still enhance main CN indices, while plans with `CN_Equity_A` universe query support also enhance market breadth.
- The official quickstart documents `quotes.get(universes=["CN_Equity_A"])`, but online smoke tests confirmed two additional real-world constraints: universe access depends on plan permissions, and `quotes.get(symbols=[...])` has a per-request symbol limit.
- TickFlow currently returns `change_pct`/`amplitude` as ratio values; this integration normalizes them to the project's percent convention so they match AkShare / Tushare / efinance semantics.
- CN market review reports now use a post-market workstation layout with fixed market light, market temperature, index detail, sector Top tables, news catalysts, next-session plan, and risk sections. Missing data sources degrade by omitting or simplifying only the affected block.
- Per-stock analysis, realtime quote priority, and sector rankings fallback remain unchanged.
The image uses prebuilt frontend assets under /app/static at runtime, so the running server container does not require the apps/dsa-web source tree or runtime npm. If WebUI cannot be opened after Docker deployment, first verify that /app/static/index.html exists inside the container.
Official image registries:
- `ghcr.io/zhulinsen/daily_stock_analysis:<tag>`
- `<DOCKERHUB_USERNAME>/daily_stock_analysis:<tag>` (driven by the publisher's `DOCKERHUB_USERNAME` secret; the official release uses `zhulinsen/daily_stock_analysis`)

# 1. Clone repository
git clone https://github.com/ZhuLinsen/daily_stock_analysis.git
cd daily_stock_analysis
# 2. Configure environment variables
cp .env.example .env
vim .env # Fill in API Keys and configuration
# 3. Start container
docker-compose -f ./docker/docker-compose.yml up -d server # Web service mode (recommended, provides API & WebUI)
docker-compose -f ./docker/docker-compose.yml up -d analyzer # Scheduled task mode
docker-compose -f ./docker/docker-compose.yml up -d # Start both modes
# 4. Access WebUI
# http://localhost:8000
# 5. View logs
docker-compose -f ./docker/docker-compose.yml logs -f server
If you do not want to keep the source tree on the target machine, you can run the published image directly:
# Web/API mode
docker pull zhulinsen/daily_stock_analysis:latest
docker run -d \
--name dsa-server \
--env-file .env \
-p 8000:8000 \
-v "$(pwd)/data:/app/data" \
-v "$(pwd)/logs:/app/logs" \
-v "$(pwd)/reports:/app/reports" \
-v "$(pwd)/.env:/app/.env" \
zhulinsen/daily_stock_analysis:latest \
python main.py --serve-only --host 0.0.0.0 --port 8000
# Scheduled-task mode
docker run -d \
--name dsa-analyzer \
--env-file .env \
-v "$(pwd)/data:/app/data" \
-v "$(pwd)/logs:/app/logs" \
-v "$(pwd)/reports:/app/reports" \
-v "$(pwd)/.env:/app/.env" \
zhulinsen/daily_stock_analysis:latest
For pinned deployments or easier rollback, replace latest with a concrete version tag such as v3.13.0.
| Command | Description | Port |
|---|---|---|
docker-compose -f ./docker/docker-compose.yml up -d server | Web service mode, provides API & WebUI | 8000 |
docker-compose -f ./docker/docker-compose.yml up -d analyzer | Scheduled task mode, daily auto execution | - |
docker-compose -f ./docker/docker-compose.yml up -d | Start both modes simultaneously | 8000 |
docker-compose.yml uses YAML anchors to reuse configuration:
version: '3.8'
x-common: &common
build:
context: ..
dockerfile: docker/Dockerfile
restart: unless-stopped
env_file:
- ../.env
environment:
- TZ=Asia/Shanghai
volumes:
- ../data:/app/data
- ../logs:/app/logs
- ../reports:/app/reports
- ../.env:/app/.env
- ../strategies:/app/strategies:ro
services:
# Scheduled task mode
analyzer:
<<: *common
container_name: stock-analyzer
# FastAPI mode
server:
<<: *common
container_name: stock-server
command: ["python", "main.py", "--serve-only", "--host", "0.0.0.0", "--port", "${API_PORT:-8000}"]
ports:
- "${API_PORT:-8000}:${API_PORT:-8000}"
.env and Volume Mapping

For both docker run and Compose, keep these two layers in mind:
1. `--env-file .env` (or Compose `env_file`): passes key/value pairs from `.env` into the container process environment.
2. `-v "$(pwd)/.env:/app/.env"` (or Compose `../.env:/app/.env`): mounts the same `.env` file into the container so the Web settings page and backend read/write the same persisted config file.

Recommended host mappings:
- `./data:/app/data` for runtime data and database files
- `./logs:/app/logs` for logs
- `./reports:/app/reports` for generated reports
- `./strategies:/app/strategies:ro` for custom strategy YAML files

Optional static asset override: `./static:/app/static:ro`

# View running status
docker-compose -f ./docker/docker-compose.yml ps
# View logs
docker-compose -f ./docker/docker-compose.yml logs -f server
# Stop services
docker-compose -f ./docker/docker-compose.yml down
# Rebuild image (after code update)
docker-compose -f ./docker/docker-compose.yml build --no-cache
docker-compose -f ./docker/docker-compose.yml up -d server
docker build -f docker/Dockerfile -t stock-analysis .
docker run -d \
--name dsa-server-local \
--env-file .env \
-p 8000:8000 \
-v "$(pwd)/data:/app/data" \
-v "$(pwd)/logs:/app/logs" \
-v "$(pwd)/reports:/app/reports" \
-v "$(pwd)/.env:/app/.env" \
stock-analysis \
python main.py --serve-only --host 0.0.0.0 --port 8000
# Python 3.10+ recommended
pip install -r requirements.txt
# Or use conda
conda create -n stock python=3.10
conda activate stock
pip install -r requirements.txt
python main.py # Full analysis (stocks + market review)
python main.py --market-review # Market review only
python main.py --no-market-review # Stock analysis only
python main.py --stocks 600519,300750 # Specify stocks
python main.py --dry-run # Fetch data only, no AI analysis
python main.py --no-notify # Don't send notifications
python main.py --schedule # Scheduled task mode
python main.py --debug # Debug mode (verbose logging)
python main.py --workers 5 # Specify concurrency
Edit .github/workflows/daily_analysis.yml:
schedule:
# UTC time, Beijing time = UTC + 8
- cron: '0 10 * * 1-5' # Monday to Friday 18:00 (Beijing Time)
Common time reference:
| Beijing Time | UTC cron expression |
|---|---|
| 09:30 | '30 1 * * 1-5' |
| 12:00 | '0 4 * * 1-5' |
| 15:00 | '0 7 * * 1-5' |
| 18:00 | '0 10 * * 1-5' |
| 21:00 | '0 13 * * 1-5' |
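The mapping in the table follows directly from Beijing = UTC+8. For other times, this throwaway shell one-liner (not part of the project) computes the UTC hour for the cron field:

```shell
# Hypothetical helper, not part of the project: derive the UTC cron hour
# from a Beijing-time hour (Beijing = UTC+8, no DST).
beijing_hour=18
utc_hour=$(( (beijing_hour - 8 + 24) % 24 ))
echo "$utc_hour"   # prints: 10  ->  cron '0 10 * * 1-5'
```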
# Start scheduled mode (default 18:00 execution)
python main.py --schedule
# Or use crontab
crontab -e
# Add: 0 18 * * 1-5 cd /path/to/project && python main.py
Note: Scheduled mode reloads the saved `STOCK_LIST` before each run. If you also pass `--stocks`, it will not pin future scheduled executions to the startup snapshot; use a normal one-off run when you want to analyze a temporary stock list.

When the built-in scheduler is started via `python main.py --schedule`, `python main.py --serve --schedule`, or an equivalent local mode, saving a new `SCHEDULE_TIME` from the WebUI rebinds the daily job on the next scheduler poll without restarting the process. The previous trigger time is removed instead of being kept alongside the new one.
Set `WECHAT_WEBHOOK_URL` to the bot webhook URL.

⚠️ Key distinction: `FEISHU_WEBHOOK_SECRET` (webhook signing secret) and `FEISHU_APP_SECRET` (Feishu App Secret) are two completely different configuration variables and cannot be used interchangeably.
Minimum viable config (no security restrictions):
FEISHU_WEBHOOK_URL=https://open.feishu.cn/open-apis/bot/v2/hook/your_hook_token
Step-by-step setup:
1. In the Feishu group, add a custom bot and copy the webhook URL (format: `https://open.feishu.cn/open-apis/bot/v2/hook/...`).
2. Set `FEISHU_WEBHOOK_URL` to the URL you just copied. If no security option is enabled, only `FEISHU_WEBHOOK_URL` is needed.
3. If the "Signature" security option is enabled, also set `FEISHU_WEBHOOK_SECRET`. Both sides must be enabled or disabled together — if Feishu has signing on but `FEISHU_WEBHOOK_SECRET` is missing (or vice versa), every request will be rejected.
4. If the "Keyword" security option is enabled, set `FEISHU_WEBHOOK_KEYWORD`. The app will prepend it to every message automatically; no need to change report templates.
5. `FEISHU_APP_ID` / `FEISHU_APP_SECRET` are for Feishu app / Stream Bot / cloud document flows only — they do not trigger group webhook notifications and must not be used instead of `FEISHU_WEBHOOK_URL`.

Common failure causes:
- `FEISHU_APP_ID` / `FEISHU_APP_SECRET` were set, but `FEISHU_WEBHOOK_URL` was not configured
- Signing is enabled on the Feishu side, but `FEISHU_WEBHOOK_SECRET` was not set locally (or was mistakenly set to `FEISHU_APP_SECRET`)
- Keyword security is enabled, but `FEISHU_WEBHOOK_KEYWORD` was not set locally
- App credentials were incomplete (`FEISHU_APP_ID` / `FEISHU_APP_SECRET` / `FEISHU_FOLDER_TOKEN`)

For a full illustrated troubleshooting guide, see docs/bot/feishu-bot-config.md.
- Telegram: set `TELEGRAM_BOT_TOKEN` and `TELEGRAM_CHAT_ID`; to push into a topic, also set `TELEGRAM_MESSAGE_THREAD_ID` (get it from the Topic link).
- Email: set `EMAIL_SENDER`, `EMAIL_PASSWORD`, `EMAIL_RECEIVERS`.

Supported email providers:
Send different stock groups to different email recipients (Issue #268, optional):
Configure STOCK_GROUP_N and EMAIL_GROUP_N to route different stock groups to different inboxes. STOCK_LIST still defines the actual analysis scope, so each STOCK_GROUP_N should be a subset of STOCK_LIST. This only changes email recipients; Telegram, WeChat, Webhook, and other channels still receive the full report for the entire STOCK_LIST. Market review emails are sent to all configured group recipients.
GitHub Actions limitation: as of 2026-03-29, the repository's default `daily_analysis.yml` does not auto-import arbitrary numbered `STOCK_GROUP_N`/`EMAIL_GROUP_N` variables. If you only add them in repository Secrets / Variables without extending the workflow `env:` block, they will not reach the runtime process.
STOCK_LIST=600519,300750,002594,AAPL
STOCK_GROUP_1=600519,300750
[email protected]
STOCK_GROUP_2=002594,AAPL
[email protected]
Supports any POST JSON Webhook, including:
Set CUSTOM_WEBHOOK_URLS, separate multiple with commas.
If AstrBot, NapCat, or a self-hosted service requires a custom request body, set
CUSTOM_WEBHOOK_BODY_TEMPLATE. The rendered value must be a JSON object. Prefer
$content_json so newlines and quotes stay valid JSON:
CUSTOM_WEBHOOK_BODY_TEMPLATE={"msg_type":"text","content":$content_json}
Available placeholders: $content_json, $content, $title_json, $title.
Discord supports two push methods:
Method 1: Webhook (Recommended, Simple)
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/xxx/yyy
Method 2: Bot API (Requires more permissions)
DISCORD_BOT_TOKEN=your_bot_token
DISCORD_MAIN_CHANNEL_ID=your_channel_id
If you need to receive Discord Slash Command / Interaction callbacks instead of only sending notifications to Discord, also copy the public key from Discord Developer Portal -> General Information -> Public Key and configure:
DISCORD_INTERACTIONS_PUBLIC_KEY=your_public_key
Without this public key, inbound Discord webhook requests are rejected.
Slack supports two push methods. When both are configured, Bot API takes priority to ensure text and images land in the same channel:
Method 1: Bot API (Recommended, supports image upload)
Required bot token scopes: `chat:write`, `files:write`.

SLACK_BOT_TOKEN=xoxb-...
SLACK_CHANNEL_ID=C01234567
Method 2: Incoming Webhook (Simple setup, text only)
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/T.../B.../xxx
Pushover is a cross-platform push service supporting iOS and Android.
PUSHOVER_USER_KEY=your_user_key
PUSHOVER_API_TOKEN=your_api_token
Features:
System defaults to AkShare (free), also supports other data sources:
- Tushare: set `TUSHARE_TOKEN`

Use the hk prefix for HK stock codes:
STOCK_LIST=600519,hk00700,hk01810
Configure multiple models, system auto-switches:
# Gemini (primary)
GEMINI_API_KEY=xxx
GEMINI_MODEL=gemini-3.1-pro-preview
# OpenAI compatible (backup)
OPENAI_API_KEY=xxx
OPENAI_BASE_URL=https://api.deepseek.com
OPENAI_MODEL=deepseek-v4-flash
# deepseek-chat / deepseek-reasoner remain compatible, but DeepSeek marks them deprecated after 2026/07/24
See LLM Config Guide. Most users only need to think in terms of primary models, fallback models, and channels; this section is for expert users who want direct access to the underlying LiteLLM routing capabilities. No separate Proxy service is required.
Two-layer mechanism: Same-model multi-key rotation (Router) and cross-model fallback are independent.
Multi-key + cross-model fallback example:
# Primary: 3 Gemini keys rotate; Router switches on 429
GEMINI_API_KEYS=key1,key2,key3
LITELLM_MODEL=gemini/gemini-3.1-pro-preview
# Cross-model fallback: when all primary keys fail, try Claude → GPT
# Requires ANTHROPIC_API_KEY, OPENAI_API_KEY
LITELLM_FALLBACK_MODELS=anthropic/claude-sonnet-4-6,openai/gpt-5.4-mini
⚠️ `LITELLM_MODEL` must include a provider prefix (e.g. `gemini/`, `anthropic/`, `openai/`). Legacy `GEMINI_MODEL` (no prefix) is only used when `LITELLM_MODEL` is not set.
Vision model (image stock code extraction): See LLM Config Guide - Vision.
python main.py --debug
Log file locations:
- Normal: `logs/stock_analysis_YYYYMMDD.log`
- Debug: `logs/stock_analysis_debug_YYYYMMDD.log`

Debug logs keep the app's own DEBUG messages, but LiteLLM internals default to WARNING to avoid token-level third-party noise during streaming generation. To inspect LiteLLM internals temporarily, set `LITELLM_LOG_LEVEL=DEBUG` in `.env`.
For file-based SQLite databases, the app now enables WAL and sets busy_timeout on connection startup. save_daily_data() also uses a batch atomic upsert on (code, date) to reduce lock contention during bulk writes and concurrent callbacks.
You can tune the behavior in .env:
| Variable | Default | Description |
|---|---|---|
| `SQLITE_WAL_ENABLED` | `true` | Enable `journal_mode=WAL` for file-based SQLite |
| `SQLITE_BUSY_TIMEOUT_MS` | `5000` | SQLite lock wait timeout in milliseconds |
| `SQLITE_WRITE_RETRY_MAX` | `3` | Max retries for `database is locked` / `database table is locked` errors |
| `SQLITE_WRITE_RETRY_BASE_DELAY` | `0.1` | Base backoff delay in seconds for exponential write retries |
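The combination of WAL, `busy_timeout`, batch upsert, and exponential retry can be sketched with the standard `sqlite3` module. This assumes a simplified `stock_daily` table keyed on `(code, date)`; the real schema has more columns:

```python
import sqlite3
import time

def connect(db_path: str, wal: bool = True, busy_timeout_ms: int = 5000):
    """Open a SQLite connection with the settings described above."""
    conn = sqlite3.connect(db_path)
    if wal:
        conn.execute("PRAGMA journal_mode=WAL")             # SQLITE_WAL_ENABLED
    conn.execute(f"PRAGMA busy_timeout={busy_timeout_ms}")  # SQLITE_BUSY_TIMEOUT_MS
    return conn

def upsert_daily(conn, rows, max_retries=3, base_delay=0.1):
    """Batch atomic upsert on (code, date), retrying with exponential
    backoff on 'database is locked' errors (mirrors the retry env vars)."""
    for attempt in range(max_retries + 1):
        try:
            with conn:  # one transaction for the whole batch
                conn.executemany(
                    "INSERT INTO stock_daily(code, date, close) VALUES (?, ?, ?) "
                    "ON CONFLICT(code, date) DO UPDATE SET close = excluded.close",
                    rows,
                )
            return
        except sqlite3.OperationalError as err:
            if "locked" not in str(err) or attempt == max_retries:
                raise
            time.sleep(base_delay * 2 ** attempt)  # SQLITE_WRITE_RETRY_BASE_DELAY
```

Re-running the same batch updates rows in place instead of raising a uniqueness error, which is what makes concurrent callbacks safe to retry.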
The backtesting module automatically validates historical AI analysis records against actual price movements, evaluating the accuracy of analysis recommendations.
Only `AnalysisHistory` records past the cooldown period (default 14 days) are backtested.

| Operation Advice | Position | Expected Direction | Win Condition |
|---|---|---|---|
| Buy / Add / Strong Buy | long | up | Return >= neutral band |
| Sell / Reduce / Strong Sell | cash | down | Decline >= neutral band |
| Hold | long | not_down | No significant decline |
| Wait / Observe | cash | flat | Price within neutral band |
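The mapping above can be expressed as a small classifier. A sketch only — the real engine also handles SL/TP exits, and the treatment of in-band outcomes for Hold/Wait as wins is an assumption here:

```python
def classify(advice: str, return_pct: float, neutral_band: float = 2.0) -> str:
    """Classify one backtested record as 'win', 'loss', or 'neutral'
    per the advice-to-expectation table (neutral band defaults to ±2%)."""
    advice = advice.lower()
    if advice in {"buy", "add", "strong buy"}:        # expect up
        if return_pct >= neutral_band:
            return "win"
        return "loss" if return_pct <= -neutral_band else "neutral"
    if advice in {"sell", "reduce", "strong sell"}:   # expect down
        if return_pct <= -neutral_band:
            return "win"
        return "loss" if return_pct >= neutral_band else "neutral"
    if advice == "hold":                              # win = no significant decline
        return "loss" if return_pct <= -neutral_band else "win"
    # wait / observe: win when price stays inside the neutral band
    return "win" if abs(return_pct) < neutral_band else "loss"
```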
Set the following variables in `.env` (all optional, with defaults):
| Variable | Default | Description |
|---|---|---|
| `BACKTEST_ENABLED` | `true` | Whether to auto-run backtest after daily analysis |
| `BACKTEST_EVAL_WINDOW_DAYS` | `10` | Evaluation window (trading days) |
| `BACKTEST_MIN_AGE_DAYS` | `14` | Only backtest records older than N days, to avoid incomplete data |
| `BACKTEST_ENGINE_VERSION` | `v1` | Engine version, used to distinguish results when logic is updated |
| `BACKTEST_NEUTRAL_BAND_PCT` | `2.0` | Neutral band threshold (%); ±2% is treated as range-bound |
Backtesting triggers automatically after the daily analysis flow completes (non-blocking; failures do not affect notifications). It can also be triggered manually via API.
| Metric | Description |
|---|---|
| `direction_accuracy_pct` | Direction prediction accuracy (expected direction matches actual) |
| `win_rate_pct` | Win rate (wins / (wins + losses), excludes neutral) |
| `avg_stock_return_pct` | Average stock return percentage |
| `avg_simulated_return_pct` | Average simulated execution return (including SL/TP exits) |
| `stop_loss_trigger_rate` | Stop-loss trigger rate (only counts records with SL configured) |
| `take_profit_trigger_rate` | Take-profit trigger rate (only counts records with TP configured) |
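For illustration, the two headline rates computed from classified outcomes (a sketch; the record field names are assumptions, not the engine's actual schema):

```python
def summarize(records):
    """records: dicts with 'outcome' ('win'/'loss'/'neutral') and
    'direction_hit' (bool). Returns the two headline percentages."""
    wins = sum(r["outcome"] == "win" for r in records)
    losses = sum(r["outcome"] == "loss" for r in records)
    hits = sum(r["direction_hit"] for r in records)
    return {
        # neutral outcomes are excluded from the win-rate denominator
        "win_rate_pct": round(100 * wins / (wins + losses), 2) if wins + losses else 0.0,
        "direction_accuracy_pct": round(100 * hits / len(records), 2) if records else 0.0,
    }
```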
The WebUI and FastAPI API share the same service process. After startup, use the browser workspace for configuration management, manual analysis, task progress, historical reports, backtesting, portfolio management, and smart import. Authentication, cloud-server access, and API usage details are covered below.
FastAPI provides RESTful API service for configuration management and triggering analysis.
| Command | Description |
|---|---|
| `python main.py --serve` | Start API service + run full analysis once |
| `python main.py --serve-only` | Start API service only; trigger analysis manually |
- Realtime `message`/`progress` updates via the task SSE endpoint
- `/docs` for the Swagger UI

| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/analysis/analyze` | POST | Trigger stock analysis |
| `/api/v1/analysis/tasks` | GET | Query task list |
| `/api/v1/analysis/tasks/stream` | GET (SSE) | Subscribe to realtime task updates |
| `/api/v1/analysis/status/{task_id}` | GET | Query task status |
| `/api/v1/history` | GET | Query analysis history |
| `/api/v1/usage/summary?period=today\|month\|all` | GET | Query LLM usage summary |
| `/api/v1/backtest/run` | POST | Trigger backtest |
| `/api/v1/backtest/results` | GET | Query backtest results (paginated) |
| `/api/v1/backtest/performance` | GET | Get overall backtest performance |
| `/api/v1/backtest/performance/{code}` | GET | Get per-stock backtest performance |
| `/api/health` | GET | Health check |
| `/docs` | GET | API Swagger documentation |
Note:
`POST /api/v1/analysis/analyze` supports only one stock when `async_mode=false`; batch `stock_codes` requires `async_mode=true`. The async `202` response returns a single `task_id` for one stock, or an `accepted`/`duplicates` summary for batch requests.
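That single-vs-batch rule can be checked client-side before posting. A sketch using only the documented fields (the helper itself is hypothetical):

```python
def build_analyze_payload(codes, async_mode=True):
    """Build a request body for POST /api/v1/analysis/analyze.

    Per the API contract: batch stock_codes requires async_mode=true;
    sync mode accepts exactly one stock.
    """
    codes = list(codes)
    if not codes:
        raise ValueError("at least one stock code is required")
    if len(codes) > 1:
        if not async_mode:
            raise ValueError("batch stock_codes requires async_mode=true")
        return {"stock_codes": codes, "async_mode": True}
    return {"stock_code": codes[0], "async_mode": async_mode}
```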
Progress-stream note:
`GET /api/v1/analysis/tasks/stream` now emits `task_progress` in addition to `task_created` / `task_started` / `task_completed` / `task_failed`. The regular analysis path updates `progress` and `message` across quote preparation, news retrieval, context assembly, LLM generation, and report persistence. Streaming chunks are accumulated only on the server side; history is persisted only after the final JSON parses successfully. If streaming is unavailable before the first chunk, the system falls back to the previous non-stream request. If a stream fails after partial output has already arrived, the system first retries non-stream for the same model, then continues through the existing fallback models in the original order (primary + fallback list). If a progress callback fails, the analysis flow continues, and the exception is now logged at warning level to help troubleshoot SSE delivery gaps.
Note: this behavior is documented in the full guide (`full-guide*.md`) because it is detailed runtime SSE/fallback behavior and is therefore kept out of the README.
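Consuming the stream boils down to parsing `event:` / `data:` line pairs. A minimal sketch, assuming the endpoint follows standard SSE framing (blank line terminates an event, `data` carries JSON):

```python
import json

def iter_sse_events(lines):
    """Yield (event, data_dict) tuples from an iterable of SSE text lines."""
    event, data = None, []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and event:  # blank line terminates one event
            yield event, json.loads("\n".join(data) or "{}")
            event, data = None, []

def progress_messages(lines):
    """Collect the human-readable messages from task_progress events only."""
    return [d.get("message") for e, d in iter_sse_events(lines) if e == "task_progress"]
```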
Usage examples:
# Health check
curl http://127.0.0.1:8000/api/health
# Trigger analysis (A-shares)
curl -X POST http://127.0.0.1:8000/api/v1/analysis/analyze \
-H 'Content-Type: application/json' \
-d '{"stock_code": "600519"}'
# Query task status
curl http://127.0.0.1:8000/api/v1/analysis/status/<task_id>
# Query today's LLM usage
curl "http://127.0.0.1:8000/api/v1/usage/summary?period=today"
# Trigger backtest (all stocks)
curl -X POST http://127.0.0.1:8000/api/v1/backtest/run \
-H 'Content-Type: application/json' \
-d '{"force": false}'
# Trigger backtest (specific stock)
curl -X POST http://127.0.0.1:8000/api/v1/backtest/run \
-H 'Content-Type: application/json' \
-d '{"code": "600519", "force": false}'
# Query overall backtest performance
curl http://127.0.0.1:8000/api/v1/backtest/performance
# Query per-stock backtest performance
curl http://127.0.0.1:8000/api/v1/backtest/performance/600519
# Paginated backtest results
curl "http://127.0.0.1:8000/api/v1/backtest/results?page=1&limit=20"
Modify default port or allow LAN access:
python main.py --serve-only --host 0.0.0.0 --port 8888
| Type | Format | Examples |
|---|---|---|
| A-shares | 6-digit number | 600519, 000001, 300750 |
| BSE (Beijing) | 8/4/92 prefix, 6-digit | 920748, 838163, 430047 |
| HK stocks | hk + 5-digit number | hk00700, hk09988 |
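These formats can be validated with simple patterns (a sketch; the real validator may accept more variants):

```python
import re

HK_RE = re.compile(r"^hk\d{5}$")          # hk + 5-digit number
SIX_DIGIT_RE = re.compile(r"^\d{6}$")     # A-shares and BSE are 6-digit
BSE_PREFIX_RE = re.compile(r"^(92|8|4)")  # BSE prefixes per the table

def classify_code(code: str) -> str:
    """Return 'hk', 'bse', or 'a_share' for a stock code string."""
    code = code.strip().lower()
    if HK_RE.match(code):
        return "hk"
    if SIX_DIGIT_RE.match(code):
        return "bse" if BSE_PREFIX_RE.match(code) else "a_share"
    raise ValueError(f"unrecognized stock code: {code}")
```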
Open `http://127.0.0.1:8000` (or your configured port) in a browser.

Q: Why are pushed messages truncated?
A: WeChat Work/Feishu enforce message length limits; the system already auto-segments messages. For complete content, configure the Feishu Cloud Document feature.
Q: Why does AkShare data fetching fail?
A: AkShare relies on a scraping mechanism and may be temporarily rate-limited. A retry mechanism is already configured; usually waiting a few minutes and retrying is enough.
Q: How do I change which stocks are monitored?
A: Modify the `STOCK_LIST` environment variable, separating multiple codes with commas.
Q: Why doesn't the scheduled GitHub Actions workflow run?
A: Check whether Actions is enabled and whether the cron expression is correct (note that it uses UTC time).
- The `/portfolio` page includes a manual refresh action. It calls the `POST /api/v1/portfolio/fx/refresh` endpoint and reloads snapshot/risk data only.
- When `PORTFOLIO_FX_UPDATE_ENABLED=false`, the refresh API returns an explicit disabled status and the page shows that online FX refresh is disabled instead of implying that no refreshable pairs exist.
- `positions[]` now includes price metadata such as `price_source`, `price_date`, `price_stale`, and `price_available`. Today's snapshot uses the historical close first and only falls back to realtime quotes when no close exists, while historical `as_of` snapshots stay on historical-close semantics and no longer silently treat cost basis as the current price. Missing-price positions are marked with `price_available=false` and excluded from market value / unrealized PnL totals.
- `get_daily_history` first tries to reuse the local `stock_daily` daily-bar cache; when the cache is fresh and contains at least the dashboard default of 30 records, it avoids another external data-source request. Partially cached responses are annotated with `partial_cache=true`, `requested_days`, and `actual_records`.
- Fetched daily bars are written back to `stock_daily` on a best-effort basis, and write failures do not block the Agent response.
- `search_stock_news` and `search_comprehensive_intel` persist successful results to `news_intel` on a best-effort basis, reusing the existing URL / fallback-key deduplication logic.
- `get_realtime_quote` does not use `stock_daily` as a realtime-quote cache and does not write intraday quotes into the daily-bar table; realtime quote caching should use a dedicated realtime store if needed.

When `AGENT_EVENT_MONITOR_ENABLED=true`, schedule mode polls the rules in `AGENT_EVENT_ALERT_RULES_JSON` every `AGENT_EVENT_MONITOR_INTERVAL_MINUTES` minutes and sends triggered alerts through the existing notification channels. The runtime currently supports three rule types:
Compatibility and rollback note: this section documents current Event Monitor rule behavior (including
`price_change_percent`) and does not change external model/provider API semantics such as model names, providers, Base URL, LiteLLM, `OPENAI_*`, `DEEPSEEK_*`, or `GEMINI_*` configuration. Rollback is explicit: clear or disable `AGENT_EVENT_MONITOR_ENABLED` / related rule config to restore the previous behavior.
| `alert_type` | Direction | Threshold | Description |
|---|---|---|---|
| `price_cross` | `above` / `below` | `price` | Current price crosses a fixed threshold |
| `price_change_percent` | `up` / `down` | `change_pct` | Intraday change percentage reaches a threshold |
| `volume_spike` | - | `multiplier` | Latest volume exceeds the recent 20-day average by this multiplier |
Example:
AGENT_EVENT_MONITOR_ENABLED=true
AGENT_EVENT_MONITOR_INTERVAL_MINUTES=5
AGENT_EVENT_ALERT_RULES_JSON=[{"stock_code":"600519","alert_type":"price_cross","direction":"above","price":1800},{"stock_code":"300750","alert_type":"price_change_percent","direction":"down","change_pct":3.0},{"stock_code":"000858","alert_type":"volume_spike","multiplier":2.5}]
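The three rule types evaluate roughly as follows. A sketch only — the quote field names (`price`, `change_pct`, `volume`, `avg_volume_20d`) are assumptions, not the monitor's actual internal schema:

```python
def rule_triggered(rule: dict, quote: dict) -> bool:
    """Evaluate one alert rule against a quote snapshot."""
    kind = rule["alert_type"]
    if kind == "price_cross":
        # Current price crosses a fixed threshold in the given direction.
        return (quote["price"] >= rule["price"] if rule["direction"] == "above"
                else quote["price"] <= rule["price"])
    if kind == "price_change_percent":
        # Intraday change percentage reaches the threshold.
        return (quote["change_pct"] >= rule["change_pct"] if rule["direction"] == "up"
                else quote["change_pct"] <= -rule["change_pct"])
    if kind == "volume_spike":
        # Latest volume exceeds the recent 20-day average by the multiplier.
        return quote["volume"] >= rule["multiplier"] * quote["avg_volume_20d"]
    raise ValueError(f"unknown alert_type: {kind}")
```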
For more questions, please submit an Issue.