# @screenpipe/sync
Sync your Screenpipe data to remote AI agents: a one-liner that extracts daily summaries from your screen activity and syncs them to a remote server (e.g., OpenClaw). Uses the Claude Code CLI for AI-powered extraction.
## Quick Start

```bash
# AI summary to stdout
bunx @screenpipe/sync

# Save daily summaries locally
bunx @screenpipe/sync --output ~/Documents/brain/context --git

# Sync summaries to a remote server
bunx @screenpipe/sync --output /tmp/summaries --remote user@host:~/screenpipe-pkm

# Persistent daemon (survives reboot)
bunx @screenpipe/sync --daemon --output /tmp/summaries --remote user@host:~/clawd/screenpipe-pkm
```
## What It Extracts

| Category | Description |
|---|---|
| Todos | Action items visible on screen or mentioned |
| Goals | Objectives, intentions, targets mentioned |
| Decisions | Choices made or discussed |
| Activities | Key tasks worked on, by app |
| Meetings | Calls, conversations, collaborations |
| Blockers | Problems, frustrations, obstacles |
| Insights | AI observations about work patterns |
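The categories above could be modeled as a typed summary record. The following is an illustrative sketch only; the field names and shape are assumptions, not the package's actual output schema:

```typescript
// Hypothetical shape of one day's extracted summary.
// Field names are illustrative; check the real @screenpipe/sync output.
interface DailySummary {
  date: string;                     // ISO date, e.g. "2026-01-31"
  minutesAnalyzed: number;          // total screen time analyzed
  appsUsed: Record<string, number>; // app name -> approximate minutes
  todos: string[];
  goals: string[];
  decisions: string[];
  blockers: string[];
  insights: string[];
}

const example: DailySummary = {
  date: "2026-01-31",
  minutesAnalyzed: 480,
  appsUsed: { "VS Code": 180, Chrome: 120, Slack: 60 },
  todos: ["Fix authentication bug in login.ts"],
  goals: ["Ship v2.9 by Friday"],
  decisions: [],
  blockers: [],
  insights: ["Deep focus block from 2-4pm on auth refactor"],
};
```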
## Example Output

```markdown
# Daily Context - 2026-01-31

> Analyzed 480 minutes of screen activity

## Apps Used
- **VS Code**: ~180 min
- **Chrome**: ~120 min
- **Slack**: ~60 min

## Todos Extracted
- Fix authentication bug in login.ts
- Review PR #234 for payment integration
- Send weekly update to investors

## Goals Mentioned
- Ship v2.9 by Friday
- Reach 50% activation rate

## AI Insights
- Heavy context switching between Slack and VS Code
- Deep focus block from 2-4pm on auth refactor
```
## Flags

| Flag | Description | Default |
|---|---|---|
| `-o, --output <dir>` | Save to directory | stdout |
| `-h, --hours <n>` | Hours to analyze | 12 |
| `-g, --git` | Auto-commit & push | false |
| `-r, --remote <host>` | Sync via SSH | - |
| `-d, --daemon` | Install persistent sync | false |
| `--interval <secs>` | Daemon interval (seconds) | 3600 |
| `--stop` | Stop the daemon | - |
| `--json` | JSON output | markdown |
| `-v, --verbose` | Debug output | false |
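For a sense of how these flags and defaults fit together, here is a minimal hand-rolled parser sketch. It is hypothetical (the real CLI may use a proper argument-parsing library), but the flag names and defaults match the table above:

```typescript
// Sketch of parsing the flags above into an options object.
// Defaults mirror the table: hours=12, interval=3600, booleans off.
interface SyncOptions {
  output?: string;
  hours: number;
  git: boolean;
  remote?: string;
  daemon: boolean;
  interval: number;
  json: boolean;
  verbose: boolean;
  stop: boolean;
}

function parseFlags(argv: string[]): SyncOptions {
  const opts: SyncOptions = {
    hours: 12, git: false, daemon: false,
    interval: 3600, json: false, verbose: false, stop: false,
  };
  for (let i = 0; i < argv.length; i++) {
    switch (argv[i]) {
      case "-o": case "--output":   opts.output = argv[++i]; break;
      case "-h": case "--hours":    opts.hours = Number(argv[++i]); break;
      case "-g": case "--git":      opts.git = true; break;
      case "-r": case "--remote":   opts.remote = argv[++i]; break;
      case "-d": case "--daemon":   opts.daemon = true; break;
      case "--interval":            opts.interval = Number(argv[++i]); break;
      case "--stop":                opts.stop = true; break;
      case "--json":                opts.json = true; break;
      case "-v": case "--verbose":  opts.verbose = true; break;
    }
  }
  return opts;
}
```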
## AI Extraction

Uses the Claude Code CLI if available:

```bash
# Install Claude Code CLI
npm install -g @anthropic-ai/claude-code

# Verify it works
claude --version
```

Falls back to structured (non-AI) extraction if the CLI is not found.
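A non-AI fallback can be as simple as pattern matching over raw screen text. This is a hypothetical sketch of that idea, not the package's actual fallback logic:

```typescript
// Sketch: pull likely action items out of captured screen text
// with simple pattern matching (no AI). Illustrative only.
function extractTodos(text: string): string[] {
  const patterns = [
    /^\s*(?:TODO|FIXME)[:\s]+(.+)$/i, // "TODO: ..." / "FIXME ..."
    /^\s*[-*]\s*\[ \]\s+(.+)$/,       // markdown checkbox "- [ ] ..."
  ];
  const todos: string[] = [];
  for (const line of text.split("\n")) {
    for (const re of patterns) {
      const m = line.match(re);
      if (m) {
        todos.push(m[1].trim());
        break;
      }
    }
  }
  return todos;
}
```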
## OpenClaw Integration

For daily digests via Telegram:

```bash
bunx @screenpipe/sync --daemon --output /tmp/summaries --remote openclaw:~/clawd/screenpipe-pkm

# Already configured - runs at 9pm PT
sudo docker exec openclaw-gateway node dist/index.js cron list
```
## More Examples

```bash
# Summarize the last 16 hours into a journal directory
bunx @screenpipe/sync --output ~/journal --hours 16

# Dump a full week (168 hours) as JSON
bunx @screenpipe/sync --hours 168 --json > week.json
```
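Once you have JSON output for several days, you can post-process it however you like. A small sketch, assuming a hypothetical `appsUsed` field mapping app names to minutes (verify against the actual JSON schema):

```typescript
// Sketch: aggregate per-app minutes across several days of --json output.
// The `appsUsed` field name is an assumption, not a documented schema.
type DayApps = { appsUsed: Record<string, number> };

function totalAppMinutes(days: DayApps[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const day of days) {
    for (const [app, mins] of Object.entries(day.appsUsed)) {
      totals[app] = (totals[app] ?? 0) + mins;
    }
  }
  return totals;
}
```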
## Daemon Mode

```bash
# Install persistent sync - survives reboot
bunx @screenpipe/sync --daemon -r user@host:~/screenpipe-pkm

# Check status
# macOS: cat /tmp/screenpipe-sync.log
# Linux: systemctl --user status screenpipe-sync.timer

# Stop the daemon
bunx @screenpipe/sync --stop
```
## Privacy

All processing happens locally. Screen data never leaves your machine unless you explicitly sync summaries to a remote.
## License

MIT. Part of the Screenpipe project.