Complete Environment Reference

Comprehensive list of all environment variables available in Open Notebook.


API Configuration

| Variable | Required? | Default | Description |
|---|---|---|---|
| `API_URL` | No | Auto-detected | URL where the frontend reaches the API (e.g., `http://localhost:5055`) |
| `INTERNAL_API_URL` | No | `http://localhost:5055` | Internal API URL for Next.js server-side proxying |
| `API_CLIENT_TIMEOUT` | No | `300` | Client timeout in seconds (how long to wait for an API response) |
| `OPEN_NOTEBOOK_PASSWORD` | No | None | Password to protect the Open Notebook instance |
| `OPEN_NOTEBOOK_ENCRYPTION_KEY` | Yes | None | Secret string used to encrypt credentials stored in the database (any string works). Required for the credential system. Supports Docker secrets via the `_FILE` suffix. |
| `HOSTNAME` | No | `0.0.0.0` (in Docker) | Network interface for Next.js to bind to. The default `0.0.0.0` ensures accessibility from reverse proxies |

Important: OPEN_NOTEBOOK_ENCRYPTION_KEY is required for storing AI provider credentials via the Settings UI. Without it, you cannot save credentials. If you change or lose this key, all stored credentials become unreadable.
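Any string works as the key, but a long random value is safer. One quick way to generate one (assuming `openssl` is available):

```shell
# Generate a random 32-byte key, base64-encoded, for use as
# OPEN_NOTEBOOK_ENCRYPTION_KEY. Store it somewhere safe: if you lose it,
# all stored credentials become unreadable.
KEY=$(openssl rand -base64 32)
echo "OPEN_NOTEBOOK_ENCRYPTION_KEY=${KEY}"
```

Put the printed line in your `.env` or `docker-compose.yml`, not in version control.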


Database: SurrealDB

| Variable | Required? | Default | Description |
|---|---|---|---|
| `SURREAL_URL` | Yes | `ws://surrealdb:8000/rpc` | SurrealDB WebSocket connection URL |
| `SURREAL_USER` | Yes | `root` | SurrealDB username |
| `SURREAL_PASSWORD` | Yes | `root` | SurrealDB password |
| `SURREAL_NAMESPACE` | Yes | `open_notebook` | SurrealDB namespace |
| `SURREAL_DATABASE` | Yes | `open_notebook` | SurrealDB database name |

Database: Retry Configuration

| Variable | Required? | Default | Description |
|---|---|---|---|
| `SURREAL_COMMANDS_RETRY_ENABLED` | No | `true` | Enable retries on failure |
| `SURREAL_COMMANDS_RETRY_MAX_ATTEMPTS` | No | `3` | Maximum retry attempts |
| `SURREAL_COMMANDS_RETRY_WAIT_STRATEGY` | No | `exponential_jitter` | Retry wait strategy (`exponential_jitter` / `exponential` / `fixed` / `random`) |
| `SURREAL_COMMANDS_RETRY_WAIT_MIN` | No | `1` | Minimum wait time between retries (seconds) |
| `SURREAL_COMMANDS_RETRY_WAIT_MAX` | No | `30` | Maximum wait time between retries (seconds) |
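To see how the min/max settings interact, here is a rough sketch (not Open Notebook's actual retry code) of exponential backoff capped at the maximum wait, using the defaults above; the loop runs past the default of 3 attempts just to show the cap, and the `exponential_jitter` strategy additionally picks a random wait below each computed bound:

```shell
WAIT_MIN=1   # SURREAL_COMMANDS_RETRY_WAIT_MIN
WAIT_MAX=30  # SURREAL_COMMANDS_RETRY_WAIT_MAX
for attempt in 1 2 3 4 5 6; do
  # Double the wait on each attempt, then cap it at WAIT_MAX.
  delay=$(( WAIT_MIN * (1 << (attempt - 1)) ))
  [ "$delay" -gt "$WAIT_MAX" ] && delay=$WAIT_MAX
  echo "attempt $attempt: wait up to ${delay}s"
done
# Waits grow 1, 2, 4, 8, 16, then stay capped at 30 seconds.
```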

Database: Concurrency

| Variable | Required? | Default | Description |
|---|---|---|---|
| `SURREAL_COMMANDS_MAX_TASKS` | No | `5` | Maximum concurrent database tasks |

LLM Timeouts

| Variable | Required? | Default | Description |
|---|---|---|---|
| `ESPERANTO_LLM_TIMEOUT` | No | `60` | LLM inference timeout in seconds |
| `ESPERANTO_SSL_VERIFY` | No | `true` | Verify SSL certificates (`false` = development only) |
| `ESPERANTO_SSL_CA_BUNDLE` | No | None | Path to a custom CA certificate bundle |

Text-to-Speech (TTS)

| Variable | Required? | Default | Description |
|---|---|---|---|
| `TTS_BATCH_SIZE` | No | `5` | Concurrent TTS requests (1–5, depends on provider) |

Content Extraction

| Variable | Required? | Default | Description |
|---|---|---|---|
| `FIRECRAWL_API_KEY` | No | None | Firecrawl API key for advanced web scraping |
| `JINA_API_KEY` | No | None | Jina AI API key for web extraction |


Network / Proxy

| Variable | Required? | Default | Description |
|---|---|---|---|
| `HTTP_PROXY` | No | None | HTTP proxy URL for outbound HTTP requests |
| `HTTPS_PROXY` | No | None | HTTPS proxy URL for outbound HTTPS requests |
| `NO_PROXY` | No | None | Comma-separated list of hosts that bypass the proxy |

These variables route all outbound HTTP(S) requests through a proxy server, which is useful in corporate or firewalled environments.

The underlying libraries (esperanto, content-core, podcast-creator) automatically detect proxy settings from these standard environment variables.

Affects:

  • AI provider API calls (OpenAI, Anthropic, Google, Groq, etc.)
  • Content extraction from URLs (web scraping, YouTube transcripts)
  • Podcast generation (LLM and TTS provider calls)

Format: `http://[user:pass@]host:port` or `https://[user:pass@]host:port`

Examples:

```bash
# Basic proxy
HTTP_PROXY=http://proxy.corp.com:8080
HTTPS_PROXY=http://proxy.corp.com:8080

# Authenticated proxy
HTTP_PROXY=http://user:pass@proxy.corp.com:8080
HTTPS_PROXY=http://user:pass@proxy.corp.com:8080

# Bypass proxy for local hosts
NO_PROXY=localhost,127.0.0.1,.local
```
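When setting these in a shell (rather than in a container environment), they must be exported so child processes such as the API server and workers inherit them; a quick check:

```shell
# Export the proxy settings; plain assignment without `export` is not
# visible to child processes.
export HTTPS_PROXY=http://proxy.corp.com:8080
export NO_PROXY=localhost,127.0.0.1,.local

# A child shell should report the same value:
sh -c 'echo "child sees HTTPS_PROXY=$HTTPS_PROXY"'
```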

Debugging & Monitoring

| Variable | Required? | Default | Description |
|---|---|---|---|
| `LANGCHAIN_TRACING_V2` | No | `false` | Enable LangSmith tracing |
| `LANGCHAIN_ENDPOINT` | No | `https://api.smith.langchain.com` | LangSmith endpoint |
| `LANGCHAIN_API_KEY` | No | None | LangSmith API key |
| `LANGCHAIN_PROJECT` | No | `Open Notebook` | LangSmith project name |

Setup: https://smith.langchain.com/


Environment Variables by Use Case

Minimal Setup (New Installation)

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=my-secret-key
SURREAL_URL=ws://surrealdb:8000/rpc
SURREAL_USER=root
SURREAL_PASSWORD=password
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=open_notebook
```

Then configure AI providers via Settings → API Keys in the browser.
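In a Docker Compose setup these variables typically live under the app service's `environment` block. A minimal sketch (the service and image names here are illustrative, not taken from the official compose file):

```yaml
services:
  open_notebook:
    image: lfnovo/open_notebook:latest   # illustrative image/tag
    environment:
      - OPEN_NOTEBOOK_ENCRYPTION_KEY=my-secret-key
      - SURREAL_URL=ws://surrealdb:8000/rpc
      - SURREAL_USER=root
      - SURREAL_PASSWORD=password
      - SURREAL_NAMESPACE=open_notebook
      - SURREAL_DATABASE=open_notebook
```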

Production Deployment

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=your-strong-secret-key
OPEN_NOTEBOOK_PASSWORD=your-secure-password
API_URL=https://mynotebook.example.com
SURREAL_USER=production_user
SURREAL_PASSWORD=secure_password
```

Self-Hosted Behind Reverse Proxy

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=your-secret-key
API_URL=https://mynotebook.example.com
```

Corporate Environment (Behind Proxy)

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=your-secret-key
HTTP_PROXY=http://proxy.corp.com:8080
HTTPS_PROXY=http://proxy.corp.com:8080
NO_PROXY=localhost,127.0.0.1
```

High-Performance Deployment

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=your-secret-key
SURREAL_COMMANDS_MAX_TASKS=10
TTS_BATCH_SIZE=5
API_CLIENT_TIMEOUT=600
```

Debugging

```bash
OPEN_NOTEBOOK_ENCRYPTION_KEY=your-secret-key
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=your-key
```

Validation

Check if a variable is set:

```bash
# Check a single variable
echo $OPEN_NOTEBOOK_ENCRYPTION_KEY

# Check several at once
env | grep -E "OPEN_NOTEBOOK|API_URL"

# Print all config
env | grep -E "^[A-Z_]+=" | sort
```
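If you supply the encryption key as a Docker secret via the `_FILE` suffix (see the API Configuration table), you can also sanity-check the secret file; the default path below is a hypothetical example, not a documented location:

```shell
# Hypothetical secret path; adjust to match your compose file.
SECRET_FILE="${OPEN_NOTEBOOK_ENCRYPTION_KEY_FILE:-/run/secrets/encryption_key}"
if [ -s "$SECRET_FILE" ]; then
  echo "secret file OK: $SECRET_FILE"
else
  echo "secret file missing or empty: $SECRET_FILE" >&2
fi
```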

Notes

  • Case-sensitive: `OPEN_NOTEBOOK_ENCRYPTION_KEY` is not the same as `open_notebook_encryption_key`
  • No spaces around `=`: `OPEN_NOTEBOOK_ENCRYPTION_KEY=my-key`, not `OPEN_NOTEBOOK_ENCRYPTION_KEY = my-key`
  • Quote values: use quotes for values containing spaces, e.g. `API_URL="http://my server:5055"`
  • Restart required: changes take effect only after restarting services
  • Secrets: don't commit encryption keys or passwords to git
  • AI providers: configure via Settings → API Keys in the browser (not via env vars)
  • Migration: use the Settings UI to migrate existing env vars to the credential system. See API Configuration

Quick Setup Checklist

  • Set OPEN_NOTEBOOK_ENCRYPTION_KEY in docker-compose.yml
  • Set database credentials (SURREAL_*)
  • Start services
  • Open browser → Go to Settings → API Keys
  • Add Credential for your AI provider
  • Test Connection to verify
  • Discover & Register Models
  • Set API_URL if behind reverse proxy
  • Change SURREAL_PASSWORD in production
  • Try a test chat

Done!


Legacy: AI Provider Environment Variables (Deprecated)

Deprecated: The following AI provider API key environment variables are deprecated. Configure providers via the Settings UI instead. These variables may still work as a fallback but are no longer recommended.

If you have these variables configured from a previous installation, click the Migrate to Database button in Settings → API Keys to import them into the credential system, then remove them from your configuration.

| Variable | Provider | Replacement |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI | Settings → API Keys → Add OpenAI Credential |
| `ANTHROPIC_API_KEY` | Anthropic | Settings → API Keys → Add Anthropic Credential |
| `GOOGLE_API_KEY` | Google Gemini | Settings → API Keys → Add Google Credential |
| `GEMINI_API_BASE_URL` | Google Gemini | Configure in Google Gemini credential |
| `VERTEX_PROJECT` | Vertex AI | Settings → API Keys → Add Vertex AI Credential |
| `VERTEX_LOCATION` | Vertex AI | Configure in Vertex AI credential |
| `GOOGLE_APPLICATION_CREDENTIALS` | Vertex AI | Configure in Vertex AI credential |
| `GROQ_API_KEY` | Groq | Settings → API Keys → Add Groq Credential |
| `MISTRAL_API_KEY` | Mistral | Settings → API Keys → Add Mistral Credential |
| `DEEPSEEK_API_KEY` | DeepSeek | Settings → API Keys → Add DeepSeek Credential |
| `XAI_API_KEY` | xAI | Settings → API Keys → Add xAI Credential |
| `OLLAMA_API_BASE` | Ollama | Settings → API Keys → Add Ollama Credential |
| `OPENROUTER_API_KEY` | OpenRouter | Settings → API Keys → Add OpenRouter Credential |
| `OPENROUTER_BASE_URL` | OpenRouter | Configure in OpenRouter credential |
| `VOYAGE_API_KEY` | Voyage AI | Settings → API Keys → Add Voyage AI Credential |
| `ELEVENLABS_API_KEY` | ElevenLabs | Settings → API Keys → Add ElevenLabs Credential |
| `OPENAI_COMPATIBLE_BASE_URL` | OpenAI-Compatible | Settings → API Keys → Add OpenAI-Compatible Credential |
| `OPENAI_COMPATIBLE_API_KEY` | OpenAI-Compatible | Configure in OpenAI-Compatible credential |
| `OPENAI_COMPATIBLE_BASE_URL_LLM` | OpenAI-Compatible | Configure per-service URL in credential |
| `OPENAI_COMPATIBLE_API_KEY_LLM` | OpenAI-Compatible | Configure per-service key in credential |
| `OPENAI_COMPATIBLE_BASE_URL_EMBEDDING` | OpenAI-Compatible | Configure per-service URL in credential |
| `OPENAI_COMPATIBLE_API_KEY_EMBEDDING` | OpenAI-Compatible | Configure per-service key in credential |
| `OPENAI_COMPATIBLE_BASE_URL_STT` | OpenAI-Compatible | Configure per-service URL in credential |
| `OPENAI_COMPATIBLE_API_KEY_STT` | OpenAI-Compatible | Configure per-service key in credential |
| `OPENAI_COMPATIBLE_BASE_URL_TTS` | OpenAI-Compatible | Configure per-service URL in credential |
| `OPENAI_COMPATIBLE_API_KEY_TTS` | OpenAI-Compatible | Configure per-service key in credential |
| `DASHSCOPE_API_KEY` | DashScope (Qwen) | Settings → API Keys → Add DashScope Credential |
| `MINIMAX_API_KEY` | MiniMax | Settings → API Keys → Add MiniMax Credential |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI | Settings → API Keys → Add Azure OpenAI Credential |
| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI | Configure in Azure OpenAI credential |
| `AZURE_OPENAI_API_VERSION` | Azure OpenAI | Configure in Azure OpenAI credential |
| `AZURE_OPENAI_API_KEY_LLM` | Azure OpenAI | Configure per-service in credential |
| `AZURE_OPENAI_ENDPOINT_LLM` | Azure OpenAI | Configure per-service in credential |
| `AZURE_OPENAI_API_VERSION_LLM` | Azure OpenAI | Configure per-service in credential |
| `AZURE_OPENAI_API_KEY_EMBEDDING` | Azure OpenAI | Configure per-service in credential |
| `AZURE_OPENAI_ENDPOINT_EMBEDDING` | Azure OpenAI | Configure per-service in credential |
| `AZURE_OPENAI_API_VERSION_EMBEDDING` | Azure OpenAI | Configure per-service in credential |