docs/DEPLOY_EN.md
This document explains how to deploy the AI Stock Analysis System to a server.
| Option | Pros | Cons | Recommended For |
|---|---|---|---|
| Docker Compose ⭐ | One-click deploy, isolated environment, easy migration, easy upgrade | Requires Docker installation | Recommended: Most scenarios |
| Direct Deployment | Simple, no extra dependencies | Environment dependencies, migration difficulties | Temporary testing |
| Systemd Service | System-level management, auto-start on boot | Complex configuration | Long-term stable operation |
| Supervisor | Process management, auto-restart | Requires additional installation | Multi-process management |
Conclusion: Docker Compose is recommended for the fastest and most convenient migration!
```bash
# Ubuntu/Debian
curl -fsSL https://get.docker.com | sh
sudo usermod -aG docker $USER

# CentOS
sudo yum install -y docker docker-compose
sudo systemctl start docker
sudo systemctl enable docker
```
```bash
# Clone code (or upload code to server)
git clone <your-repo-url> /opt/stock-analyzer
cd /opt/stock-analyzer

# Copy and edit configuration file
cp .env.example .env
vim .env  # Fill in real API Keys and configuration
```
```bash
# Build and start
docker-compose -f ./docker/docker-compose.yml up -d

# View logs
docker-compose -f ./docker/docker-compose.yml logs -f

# View running status
docker-compose -f ./docker/docker-compose.yml ps

# Stop services
docker-compose -f ./docker/docker-compose.yml down

# Restart services
docker-compose -f ./docker/docker-compose.yml restart

# Redeploy after code update
git pull
docker-compose -f ./docker/docker-compose.yml build --no-cache
docker-compose -f ./docker/docker-compose.yml up -d

# Enter container for debugging
docker-compose -f ./docker/docker-compose.yml exec stock-analyzer bash

# Manually run analysis once
docker-compose -f ./docker/docker-compose.yml exec stock-analyzer python main.py --no-notify
```
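Since every command above repeats the `-f ./docker/docker-compose.yml` flag, a tiny shell function can save typing. This is purely a convenience sketch; the name `dc` is our own choice, not part of the project:

```shell
# Hypothetical wrapper; "dc" is our chosen name, not part of the repo
dc() { docker-compose -f ./docker/docker-compose.yml "$@"; }

# The commands above then shorten to, e.g.:
#   dc ps
#   dc logs -f
#   dc restart
```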
Data is automatically saved to host directories:
- `./data/` - Database files
- `./logs/` - Log files
- `./reports/` - Analysis reports

```bash
# Install Python 3.10+
sudo apt update
sudo apt install -y python3.10 python3.10-venv python3-pip

# Create virtual environment
python3.10 -m venv /opt/stock-analyzer/venv
source /opt/stock-analyzer/venv/bin/activate

cd /opt/stock-analyzer
pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple

cp .env.example .env
vim .env  # Fill in configuration

# Single run
python main.py

# Scheduled task mode (foreground)
python main.py --schedule

# Background run (using nohup)
nohup python main.py --schedule > /dev/null 2>&1 &
```
Create a systemd service file for auto-start on boot and auto-restart:

```bash
sudo vim /etc/systemd/system/stock-analyzer.service
```

Contents:

```ini
[Unit]
Description=AI Stock Analysis System
After=network.target

[Service]
Type=simple
User=root
WorkingDirectory=/opt/stock-analyzer
Environment="PATH=/opt/stock-analyzer/venv/bin"
ExecStart=/opt/stock-analyzer/venv/bin/python main.py --schedule
Restart=always
RestartSec=30

[Install]
WantedBy=multi-user.target
```

```bash
# Reload configuration
sudo systemctl daemon-reload

# Start service
sudo systemctl start stock-analyzer

# Enable auto-start on boot
sudo systemctl enable stock-analyzer

# View status
sudo systemctl status stock-analyzer

# View logs
journalctl -u stock-analyzer -f
```
| Config Item | Description | How to Get |
|---|---|---|
| `ANSPIRE_API_KEYS` / `AIHUBMIX_KEY` / `GEMINI_API_KEY` / `ANTHROPIC_API_KEY` / `OPENAI_API_KEY` | Configure at least one AI model key; Anspire or AIHubMix is recommended first | Provider console |
| `STOCK_LIST` | Watchlist | Comma-separated stock codes |
| Notification channel | Configure at least one, such as WeChat Work, Feishu, Telegram, or email | Notification provider |
| Config Item | Default | Description |
|---|---|---|
| `SCHEDULE_ENABLED` | false | Enable scheduled tasks |
| `SCHEDULE_TIME` | 18:00 | Daily execution time |
| `MARKET_REVIEW_ENABLED` | true | Enable market review |
| `ANSPIRE_API_KEYS` | - | Anspire LLM and news search (recommended) |
| `AIHUBMIX_KEY` | - | AIHubMix one-key multi-model access (recommended) |
| `SERPAPI_API_KEYS` | - | SerpAPI real-time financial news search (recommended) |
| `TAVILY_API_KEYS` | - | Tavily news search (optional) |
| `MINIMAX_API_KEYS` | - | MiniMax search (optional) |
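Putting the required and most common optional items together, a minimal `.env` might look like the sketch below. Key names come from the tables above; all values are placeholders you must replace with your own:

```
# --- AI model (configure at least one key) ---
ANSPIRE_API_KEYS=your-anspire-key

# --- Watchlist (comma-separated stock codes) ---
STOCK_LIST=600519,300750

# --- Notification (configure at least one channel; URL is a placeholder) ---
WECHAT_WEBHOOK_URL=https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=xxx

# --- Scheduling ---
SCHEDULE_ENABLED=true
SCHEDULE_TIME=18:00
MARKET_REVIEW_ENABLED=true
```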
If the server is in mainland China, accessing the Gemini API requires a proxy.

Edit docker-compose.yml:

```yaml
environment:
  - http_proxy=http://your-proxy:port
  - https_proxy=http://your-proxy:port
```

Or edit the top of main.py:

```python
import os

os.environ["http_proxy"] = "http://your-proxy:port"
os.environ["https_proxy"] = "http://your-proxy:port"
```
```bash
# Docker method
docker-compose -f ./docker/docker-compose.yml logs -f --tail=100

# Direct deployment
tail -f /opt/stock-analyzer/logs/stock_analysis_*.log

# Check process
ps aux | grep main.py

# Check recent reports
ls -la /opt/stock-analyzer/reports/

# Clean old logs (keep 7 days; -type f avoids matching the directory itself)
find /opt/stock-analyzer/logs -type f -mtime +7 -delete

# Clean old reports (keep 30 days)
find /opt/stock-analyzer/reports -type f -mtime +30 -delete
```
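Before pointing a `find ... -delete` at real data, it is cheap to rehearse the retention rule in a scratch directory. A sketch assuming GNU coreutils (`touch -d`); the temp dir stands in for the real logs path:

```shell
# Rehearse the cleanup in a throwaway directory
tmp=$(mktemp -d)
touch "$tmp/new.log"                   # modified now -> should be kept
touch -d '10 days ago' "$tmp/old.log"  # 10 days old -> should be deleted
# Same retention rule as above: only files strictly older than 7 days
find "$tmp" -type f -mtime +7 -delete
ls "$tmp"
```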
```bash
# Clear the cache and rebuild
docker-compose -f ./docker/docker-compose.yml build --no-cache
```

Check the proxy configuration and make sure the server can reach the Gemini API.

```bash
# Stop the service, then delete the lock file
rm /opt/stock-analyzer/data/*.lock
```

Adjust the memory limits in docker-compose.yml:

```yaml
deploy:
  resources:
    limits:
      memory: 1G
```
Migrate from one server to another:
```bash
# Source server: package
cd /opt/stock-analyzer
tar -czvf stock-analyzer-backup.tar.gz .env data/ logs/ reports/

# Target server: deploy
mkdir -p /opt/stock-analyzer
cd /opt/stock-analyzer
git clone <your-repo-url> .
tar -xzvf stock-analyzer-backup.tar.gz
docker-compose -f ./docker/docker-compose.yml up -d
```
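Before shipping the archive to the target server, you can verify it actually contains what you expect by listing it without extracting. A sketch using throwaway files with the same layout (the real archive would hold your actual `.env` and data):

```shell
# Build a toy archive with the same layout, then list its members
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p data logs reports
echo "placeholder" > data/stock.db
echo "KEY=value" > .env
tar -czf stock-analyzer-backup.tar.gz .env data/ logs/ reports/
# -t lists members without extracting anything
contents=$(tar -tzf stock-analyzer-backup.tar.gz)
echo "$contents"
```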
The simplest option! No server needed, leverages GitHub's free compute resources.
```bash
# Initialize git (if not already)
cd /path/to/daily_stock_analysis
git init
git add .
git commit -m "Initial commit"

# Create the GitHub repo and push
# After creating a new repo on the GitHub web UI:
git remote add origin https://github.com/your-username/daily_stock_analysis.git
git branch -M main
git push -u origin main
```
Go to repo page → Settings → Secrets and variables → Actions → New repository secret
Add these Secrets:
| Secret Name | Description | Required |
|---|---|---|
| `ANSPIRE_API_KEYS` | Anspire Open API Key (one key for LLM and search) | Recommended |
| `AIHUBMIX_KEY` | AIHubMix API Key (one key for multiple model families) | Recommended |
| `ANTHROPIC_API_KEY` | Anthropic API Key | Optional |
| `GEMINI_API_KEY` | Gemini AI API Key | Optional |
| `OPENAI_API_KEY` | OpenAI-compatible API Key | Optional |
| `WECHAT_WEBHOOK_URL` | WeChat Work Bot Webhook | Optional* |
| `FEISHU_WEBHOOK_URL` | Feishu Bot Webhook | Optional* |
| `TELEGRAM_BOT_TOKEN` | Telegram Bot Token | Optional* |
| `TELEGRAM_CHAT_ID` | Telegram Chat ID | Optional* |
| `TELEGRAM_MESSAGE_THREAD_ID` | Telegram Topic ID | Optional* |
| `EMAIL_SENDER` | Sender email address | Optional* |
| `EMAIL_PASSWORD` | Email authorization code | Optional* |
| `SERVERCHAN3_SENDKEY` | ServerChan v3 SendKey | Optional* |
| `CUSTOM_WEBHOOK_URLS` | Custom Webhook (comma-separated for multiple) | Optional* |
| `STOCK_LIST` | Watchlist, e.g. 600519,300750 | ✅ Required |
| `SERPAPI_API_KEYS` | SerpAPI Key | Recommended |
| `TAVILY_API_KEYS` | Tavily Search API Key | Optional |
| `BOCHA_API_KEYS` | Bocha Search API Key | Optional |
| `BRAVE_API_KEYS` | Brave Search API Key | Optional |
| `MINIMAX_API_KEYS` | MiniMax Coding Plan Web Search | Optional |
| `TUSHARE_TOKEN` | Tushare Token | Optional |
| `GEMINI_MODEL` | Model name (default gemini-2.0-flash) | Optional |

\*Note: configure at least one notification channel; multiple channels can be configured and pushed to simultaneously.
Ensure the .github/workflows/daily_analysis.yml file exists and is committed:

```bash
git add .github/workflows/daily_analysis.yml
git commit -m "Add GitHub Actions workflow"
git push
```
- `full` - Full analysis (stocks + market)
- `market-only` - Market review only
- `stocks-only` - Stock analysis only

Default configuration: Monday to Friday, 18:00 Beijing Time, automatic execution.
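If these modes are exposed as a manual trigger, the corresponding `workflow_dispatch` input in the workflow file might look like the sketch below. This is an illustration of the standard GitHub Actions syntax, not the project's actual daily_analysis.yml, which may name the input differently:

```yaml
on:
  workflow_dispatch:
    inputs:
      mode:
        description: 'Analysis mode'
        type: choice
        default: 'full'
        options:
          - full
          - market-only
          - stocks-only
```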
Modify the time: edit the cron expression in .github/workflows/daily_analysis.yml:

```yaml
schedule:
  - cron: '0 10 * * 1-5'  # UTC time; +8 hours = Beijing time
```
Common cron examples:
| Expression | Description |
|---|---|
| `'0 10 * * 1-5'` | Mon-Fri 18:00 (Beijing) |
| `'30 7 * * 1-5'` | Mon-Fri 15:30 (Beijing) |
| `'0 10 * * *'` | Daily 18:00 (Beijing) |
| `'0 2 * * 1-5'` | Mon-Fri 10:00 (Beijing) |
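GitHub Actions cron is always evaluated in UTC, and Beijing time is UTC+8 with no daylight saving, so converting a Beijing wall-clock hour to the cron hour field is plain modular arithmetic:

```shell
# Beijing wall-clock hour -> UTC hour for the cron minute/hour fields
beijing_hour=18
utc_hour=$(( (beijing_hour - 8 + 24) % 24 ))
echo "cron: 0 ${utc_hour} * * 1-5"   # -> cron: 0 10 * * 1-5
```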
Method 1: modify the repo Secret `STOCK_LIST`.

Method 2: modify the code directly and push:

```bash
# Modify .env.example or set the default value in code
git commit -am "Update stock list"
git push
```
Q: Why isn't the scheduled task running?

A: GitHub Actions scheduled tasks can be delayed by 5-15 minutes, and GitHub disables scheduled workflows on repositories that go a long time without activity, so long periods without commits may cause the workflow to stop triggering.
Q: How to view historical reports?
A: Actions → Select run record → Artifacts → Download analysis-reports-xxx
Q: Is the free quota enough?

A: Each run takes about 2-5 minutes; at 22 workdays per month that is 44-110 minutes, well below the 2,000-minute limit.
Wishing you a smooth deployment!