This page documents every way Char stores and processes your data. Nothing is hidden. Where possible, we include code snippets from the actual codebase so you can verify each claim yourself.
All core data is stored locally on your device as plain Markdown and JSON files — not in a database. This is a deliberate choice. Files are portable, inspectable, and yours. You can open them in any text editor, back them up however you want, or drop them into an existing vault alongside tools like Obsidian. There is no proprietary format or opaque data layer between you and your data. For more on why we believe in this approach, see The Filesystem Is the Cortex.
Nothing leaves your machine unless you explicitly enable a cloud feature.
Char uses two base directories. See Data for the full directory layout.
Global base (shared across stable and nightly builds):
- `models/stt/` — downloaded speech-to-text model files (Whisper GGUF, Argmax tarballs)
- `store.json` — app state (onboarding status, pinned tabs, recently opened sessions, dismissed toasts, analytics preference, auth tokens)
- `hyprnote.json` — vault configuration (custom vault path if set)
- `search/` — Tantivy full-text search index

On macOS, this is typically `~/Library/Application Support/hyprnote/`. On Linux, `~/.local/share/hyprnote/`.
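Because everything in the global base is plain JSON, you can inspect it yourself with a few lines of code. A minimal sketch in TypeScript (Node.js); the macOS path is the default location mentioned above, and the helper is deliberately pure so it works on any JSON text:

```typescript
import { readFileSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

// List the top-level keys persisted in a JSON store file.
// Pure helper: takes the file contents as text, so it is easy to test.
function storeKeys(json: string): string[] {
  return Object.keys(JSON.parse(json) as Record<string, unknown>);
}

// Default global base on macOS (see the layout above); adjust for Linux.
const base = join(homedir(), "Library/Application Support/hyprnote");
// Example usage (uncomment to run against your own install):
// console.log(storeKeys(readFileSync(join(base, "store.json"), "utf8")));
```

The authoritative list of keys is the Rust enum linked below; this sketch only shows how to look at what is actually on your disk.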
The store.json keys are defined by this enum — nothing else is persisted through the Tauri store:
Vault base (customizable, defaults to the global base):
- `sessions/` — one subdirectory per session containing recorded audio, transcripts, notes, and attachments
- `humans/` — contact and participant data
- `organizations/` — organization data
- `chats/` — chat conversation data
- `prompts/` — custom prompt templates
- `settings.json` — your app settings

Application logs are stored in the system app log directory as rotating files (`app.log`, `app.log.1`, etc.).
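The vault base can be resolved the same way conceptually as the app does: use the custom path from `hyprnote.json` if one is set, otherwise fall back to the global base. A hedged sketch (the `vault_path` key name here is an illustrative assumption, not the actual config schema):

```typescript
// Resolve the vault base directory from the hyprnote.json contents.
// NOTE: "vault_path" is an illustrative key name, not the real schema.
function resolveVaultBase(configJson: string, globalBase: string): string {
  const config = JSON.parse(configJson) as { vault_path?: string };
  return config.vault_path ?? globalBase;
}
```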
Each session gets its own subdirectory. Here is how Char loads session content from disk — this shows exactly what files exist per session:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/fs-sync-core/src/session_content.rs#L10-L110" />

Transcript data uses this structure — word-level timestamps, speaker channels, and optional speaker hints:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/fs-sync-core/src/types.rs#L83-L119" />

When you start a recording session, Char spawns three actors in parallel:
Here is the session supervisor that orchestrates these actors:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/listener-core/src/actors/session/supervisor.rs#L56-L106" />

Audio is encoded into MP3 files on your local disk when recording retention is enabled. Here is the recorder handling incoming audio samples:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/listener-core/src/actors/recorder/mod.rs#L130-L168" />

Audio files are stored at `{vault}/sessions/{session_id}/audio.mp3` — they never leave your device unless you explicitly use cloud transcription.
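Because the on-disk location is deterministic, you can always find (or delete) a session's recording yourself. A minimal sketch of building that path:

```typescript
import { join } from "node:path";

// Build the audio file path for a given session, per the layout above:
// {vault}/sessions/{session_id}/audio.mp3
function sessionAudioPath(vault: string, sessionId: string): string {
  return join(vault, "sessions", sessionId, "audio.mp3");
}
```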
Char stores data as plain Markdown and JSON files on disk — formats you can read, move, and version-control yourself. Char does not currently add its own encryption layer. Your data is protected by your operating system's file permissions and any full-disk encryption you have enabled (such as FileVault on macOS or LUKS on Linux).
We are actively investigating end-to-end encryption (E2EE) to add an additional layer of protection. This would encrypt your data at rest so that only you can decrypt it, independent of OS-level encryption. We do not have a timeline yet, but it is a priority for us.
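To make the idea concrete: "encrypted at rest" means the files on disk are ciphertext that only a key you hold can decrypt. Char does not currently do this — the following is a generic AES-256-GCM illustration using Node's built-in crypto module, not a preview of any planned implementation:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Illustration only: Char does NOT currently encrypt files itself.
// AES-256-GCM provides both confidentiality and tamper detection (auth tag).
function encrypt(plaintext: Buffer, key: Buffer): { iv: Buffer; tag: Buffer; data: Buffer } {
  const iv = randomBytes(12); // fresh nonce per encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decrypt(enc: { iv: Buffer; tag: Buffer; data: Buffer }, key: Buffer): Buffer {
  const decipher = createDecipheriv("aes-256-gcm", key, enc.iv);
  decipher.setAuthTag(enc.tag); // decryption fails if the ciphertext was modified
  return Buffer.concat([decipher.update(enc.data), decipher.final()]);
}
```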
Char supports optional cloud database sync.
When enabled: Your session data can be synced to a remote database. This is only active if you explicitly configure a cloud database connection.
When not configured: All data stays in local Markdown and JSON files on disk.
The following sections document every case where Char sends data to an external server. If a feature is not listed here, it does not send data externally.
Char supports both local and cloud transcription.
Local models run entirely on your device — your audio never leaves your machine:
For local model details and download instructions, see Local Models.
Cloud models send your audio to the selected provider for processing. Here is how the listener actor connects to your configured STT provider:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/listener-core/src/actors/listener/adapters.rs#L22-L102" />

The `ListenerArgs` passed to the STT adapter contain the following — this is all the data sent to the provider along with your audio stream:
What is sent:
Where it goes depends on your setup:
- Audio routed through `pro.hyprnote.com` (our server) and forwarded to a curated STT provider. The proxy does not store your audio.

Supported cloud STT providers:
| Provider | Privacy Policy |
|---|---|
| Deepgram | Privacy Policy |
| AssemblyAI | Privacy Policy |
| Soniox | Privacy Policy |
| Gladia | Privacy Policy |
| OpenAI | Privacy Policy |
| ElevenLabs | Privacy Policy |
| DashScope | Privacy Policy |
| Mistral | Privacy Policy |
| Fireworks AI | Privacy Policy |
Char uses LLMs for summaries, enhanced notes, and chat. You can use cloud providers, bring your own key, or run models locally.
- **Pro Curated Models** — Subscribe to Pro for curated cloud AI models that work out of the box.
- **BYOK (Bring Your Own Key)** — Enter your own API key for OpenAI, Anthropic, Google, or Mistral.
- **Local Models** — Run models locally using LM Studio or Ollama. See Local LLM Setup. Recommended: Gemma (Google) and Qwen (Alibaba).
When using cloud-based AI features, your session content is sent to the selected LLM provider. When using local LLMs, everything stays on your device.
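Both the BYOK path and local servers like LM Studio and Ollama commonly expose an OpenAI-compatible chat API, which is why "where your content goes" reduces to which base URL is configured. A hedged sketch of building such a request (this is not Char's actual client code, just an illustration of the shape):

```typescript
// Build an OpenAI-compatible chat completion request.
// With baseUrl = "http://localhost:11434/v1" (Ollama's default) nothing
// leaves the machine; with a cloud base URL, the messages go to that provider.
function buildChatRequest(baseUrl: string, apiKey: string, model: string, content: string) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({ model, messages: [{ role: "user", content }] }),
  };
}
```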
Here is how the language model client is created — each provider connects directly to its own API:
<GithubCode url="https://github.com/fastrepl/char/blob/main/apps/desktop/src/ai/hooks/useLLMConnection.ts#L230-L310" />

When auto-enhance runs after a session ends, the enhanced result is stored locally:
<GithubCode url="https://github.com/fastrepl/char/blob/main/apps/desktop/src/store/tinybase/persister/session/save/note.ts#L39-L69" />

An analytics event is also fired when auto-enhance runs — it includes only the provider and model name, not the content:
<GithubCode url="https://github.com/fastrepl/char/blob/main/apps/desktop/src/services/enhancer/index.ts#L191-L197" />

What is sent:
Where it goes depends on your setup:
- Routed through `pro.hyprnote.com` and forwarded to a curated LLM provider via OpenRouter. Nothing is stored by our proxy. We have Zero Data Retention (ZDR) enabled on our OpenRouter account, so all requests are routed exclusively to endpoints where the provider does not store your data.

Pro users have access to MCP tools for web search and URL reading during AI-assisted note generation.
What is sent:
Where it goes:
Char collects anonymous usage analytics by default to help improve the product. You can disable this entirely in Settings.
Here is the opt-out check — when disabled, the function returns immediately without sending anything:
<GithubCode url="https://github.com/fastrepl/char/blob/main/plugins/analytics/src/ext.rs#L10-L28" />

What is attached to every analytics event — this is the complete enrichment logic:
<GithubCode url="https://github.com/fastrepl/char/blob/main/plugins/analytics/src/ext.rs#L45-L76" />

What is sent:
- A device identifier (generated by `hypr_host::fingerprint()`, not your name, email, or IP)

Where it goes:
The analytics client sends events to both services:
<GithubCode url="https://github.com/fastrepl/char/blob/main/crates/analytics/src/lib.rs#L48-L69" />

How to disable: Go to Settings and turn off analytics. When disabled, no analytics events are sent — the `is_disabled()` check short-circuits the entire flow.
Char uses Sentry for crash reporting and error tracking in release builds. Here is the complete initialization:
<GithubCode url="https://github.com/fastrepl/char/blob/main/apps/desktop/src-tauri/src/lib.rs#L26-L55" />

What is sent:
- A device fingerprint (`hypr_host::fingerprint()`) — no name, email, or IP
- The release identifier (`hyprnote-desktop@{version}`)
- `service: "hyprnote-desktop"`
- `auto_session_tracking` is explicitly set to `false`

Where it goes:
Char periodically checks if your device is online. Here is the complete implementation:
<GithubCode url="https://github.com/fastrepl/char/blob/main/plugins/network/src/actor.rs#L7-L9" />

<GithubCode url="https://github.com/fastrepl/char/blob/main/plugins/network/src/actor.rs#L83-L95" />

What happens:
- A request to https://www.google.com/generate_204 every 2 seconds

When you download a local STT model, Char fetches the model file from a hosting server.
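The connectivity check amounts to polling a well-known endpoint that returns HTTP 204 with no body. A hedged sketch (not the actual Rust actor; the URL and 2-second interval come from the implementation linked above):

```typescript
// generate_204 responds with status 204 when the network is reachable.
function statusIndicatesOnline(status: number): boolean {
  return status === 204;
}

async function checkOnline(): Promise<boolean> {
  try {
    const res = await fetch("https://www.google.com/generate_204");
    return statusIndicatesOnline(res.status);
  } catch {
    return false; // request failed entirely: treat as offline
  }
}
// The app repeats a check like this every 2 seconds.
```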
These are the supported local models:
<GithubCode url="https://github.com/fastrepl/char/blob/main/plugins/local-stt/src/model.rs#L4-L15" />

What happens:
Char checks for updates automatically in release builds and downloads them in the background when one is available.
What is sent:
Where it goes:
How often:
When you sign in for Pro or cloud features, Char authenticates via Supabase.
What is stored locally:
- Auth tokens (in `store.json`)

What is sent:
- Your auth token, sent to `pro.hyprnote.com` when using Pro features

Char does not:
To keep meeting content fully local: Use a local STT model for transcription and a local LLM (LM Studio or Ollama) for AI features. Your audio, transcripts, notes, and prompts stay on your device. Background network traffic can still come from the connectivity check, update checks/downloads, analytics if enabled, and crash reporting in release builds.
To maximize privacy:
Char is open source. You can verify everything documented here by reading the code: