Large Language Models are excellent at transforming unstructured text into structured data, but they face challenges when it comes to accurately retrieving that data over extended conversations. In this post, we'll leverage this core strength and combine it with Postgres, along with several complementary tools, to build a personalized AI assistant capable of long-term memory retention.
At a high level, the system's flexibility comes from combining a few core building blocks: an LLM-owned database schema exposed through an `execute_sql` tool, scheduled tasks for autonomy, web search for real-time information, and MCP integrations for actions in external tools.
See it at work in the video below.
<video className="rounded-sm m-0" autoPlay playsInline loop muted>
  <source
    src="https://xguihxuzqibwxjnimxev.supabase.co/storage/v1/object/public/videos/marketing/blog/natural-db/natural-db-demo-combined.mp4"
    type="video/mp4"
  />
</video>

The assistant uses a dedicated Postgres schema called `memories` to store all of its structured data. To ensure security, the LLM operates under a specific role, `memories_role`, which is granted permissions only within this schema. The LLM reads and writes the `memories` schema by calling an `execute_sql` tool; other schemas, such as `public`, are inaccessible to the LLM.

Three complementary memory types maintain conversation continuity: structured tables in the `memories` schema for long-term facts, recent chat history for short-term context, and semantic search over older messages via pgvector embeddings.
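To make the security boundary concrete, here is a minimal sketch of how an `execute_sql` tool could scope every query to the restricted role before running it. The function and type names are illustrative, not the repo's actual implementation; the key idea is that Postgres, not string filtering, enforces which schemas are reachable.

```typescript
// Hypothetical sketch: wrap each LLM-issued query in a transaction that
// switches to the restricted role first, so Postgres enforces that only
// the memories schema is reachable. Names are illustrative.
type ScopedQuery = { statements: string[] };

function buildScopedQuery(sql: string): ScopedQuery {
  return {
    statements: [
      "BEGIN",
      "SET LOCAL ROLE memories_role",
      "SET LOCAL search_path = memories",
      sql,
      "COMMIT",
    ],
  };
}

const { statements } = buildScopedQuery(
  "CREATE TABLE runs (id serial primary key, distance_km numeric)"
);
console.log(statements[1]); // "SET LOCAL ROLE memories_role"
```

`SET LOCAL` only lasts for the enclosing transaction, so the connection drops back to its normal privileges as soon as the tool call finishes.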
The system achieves autonomy through scheduled prompts, powered by pg_cron via a dedicated tool. Scheduled prompts invoke the same Edge Function as a normal prompt via pg_net, and can therefore use all the same tools.
Example: "Every Sunday at 6 PM, analyze my portfolio performance and research market trends"
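As a rough sketch of what the scheduling tool might persist for that example: pg_cron takes a standard cron expression, and pg_net lets the job call the Edge Function over HTTP. The function name, URL placeholder, and exact SQL shape below are assumptions for illustration, not the repo's actual code.

```typescript
// Illustrative: build the SQL a scheduling tool might run so that a
// recurring job posts its stored prompt back to the natural-db function,
// flowing through the identical tool pipeline as a live message.
function buildScheduleSql(jobName: string, cronExpr: string, prompt: string): string {
  const body = JSON.stringify({ prompt });
  return (
    `SELECT cron.schedule('${jobName}', '${cronExpr}', ` +
    `$$SELECT net.http_post('https://<PROJECT>.supabase.co/functions/v1/natural-db', ` +
    `body := '${body}'::jsonb)$$);`
  );
}

// "Every Sunday at 6 PM" -> minute 0, hour 18, any day, any month, weekday 0
const sql = buildScheduleSql(
  "weekly-portfolio-review",
  "0 18 * * 0",
  "Analyze my portfolio performance and research market trends"
);
console.log(sql.includes("cron.schedule")); // true
```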
The system leverages the built-in web search capabilities of LLM providers, such as OpenAI's web search tool, to access real-time information and current events.
```sql
-- Auto-generated from web search results
CREATE TABLE research_findings (
  topic TEXT,
  source_url TEXT,
  key_insights TEXT[],
  credibility_score INTEGER,
  search_date TIMESTAMPTZ DEFAULT NOW()
);
```
Through Zapier's MCP integration, your assistant can take actions in external apps, such as sending emails or creating calendar events.
The system uses a Telegram bot as the default interface, which calls an Edge Function via webhook. You can swap this for whatever interface you prefer: a web page, a voice interface, or another messaging platform.
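A minimal sketch of the checks the webhook entry point needs to perform is shown below. The field names follow Telegram's Bot API update format and its `X-Telegram-Bot-Api-Secret-Token` header; the function name and signature are illustrative, not the repo's actual handler.

```typescript
// Hedged sketch: validate the webhook secret and the sender before
// accepting a message. Returns the message text, or null to reject.
interface TelegramUpdate {
  message?: { text?: string; from?: { username?: string } };
}

function acceptUpdate(
  update: TelegramUpdate,
  secretHeader: string | null,
  expectedSecret: string,
  allowedUsernames: string[]
): string | null {
  // Telegram echoes the secret you registered at setWebhook time in the
  // X-Telegram-Bot-Api-Secret-Token header; reject anything else.
  if (secretHeader !== expectedSecret) return null;
  const username = update.message?.from?.username;
  if (!username || !allowedUsernames.includes(username)) return null;
  return update.message?.text ?? null;
}
```

Whatever interface you substitute for Telegram, the same two gates apply: authenticate the caller, then check the user against an allow-list before any LLM call happens.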
The assistant maintains two behavioral layers: a fixed system prompt that defines its core behavior, and user-editable preferences stored in the database.
When you say "be more formal" or "address me by name," these preferences are stored with version history and persist across all conversations, creating a personalized experience.
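A small sketch of what version-history storage for those preferences could look like follows; the type and field names are assumptions for illustration, not the repo's actual schema.

```typescript
// Illustrative: append-only version history for behavior preferences,
// so earlier instructions stay auditable instead of being overwritten.
interface PreferenceVersion {
  version: number;
  value: string;
  updatedAt: string;
}

function addPreferenceVersion(
  history: PreferenceVersion[],
  value: string
): PreferenceVersion[] {
  const version = history.length + 1;
  return [...history, { version, value, updatedAt: new Date().toISOString() }];
}

let style: PreferenceVersion[] = [];
style = addPreferenceVersion(style, "be more formal");
style = addPreferenceVersion(style, "address me by name");
console.log(style.length, style[1].version); // 2 2
```

Keeping every version means the assistant can answer "when did I ask you to be more formal?" rather than silently holding only the latest instruction.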
Prompt: "Help me track my daily runs by sending me a reminder each morning with details on my previous day's run"

The assistant creates a `runs` table to store distance, duration, route, weather conditions, and personal notes for each run.

Prompt: "Help me track my meals and suggest recipes based on what I have in my kitchen"

The assistant creates `recipes`, `ingredients`, `meal_history`, and `meal_ratings` tables to store cooking experiences, dietary preferences, and meal satisfaction.

Prompt: "Help me track customer feedback by analyzing support tickets daily and giving me weekly summaries"

The assistant creates a `feedback` table to store ticket analysis, themes, sentiment scores, and product areas.

Prompt: "Help me track interesting articles about AI and climate change, reminding me of important ones I haven't read"

The assistant creates an `articles` table to store article metadata, read status, relevance scores, and user interests.

If you prefer the command line, you can use the Supabase CLI to set up your database and Edge Functions. This replaces Step 1 and Step 2.
```bash
git clone https://github.com/supabase-community/natural-db.git
cd natural-db
supabase login
supabase link --project-ref <YOUR-PROJECT-ID>
supabase db push
supabase functions deploy --no-verify-jwt
```
After completing these steps, you can proceed to Step 3: Telegram Bot.
Run the migration SQL in your Supabase SQL editor: `migration.sql`. The migration:

- Enables the `pgvector` and `pg_cron` extensions.
- Creates a `memories` schema for the assistant's data.
- Creates a `memories_role` with scoped permissions to the `memories` schema.

Create three functions in the Supabase dashboard:
- `natural-db`: Main AI brain handling all processing, database operations, scheduling, and tool integration
- `telegram-input`: Webhook handler for incoming messages, with user validation and timezone management
- `telegram-outgoing`: Response formatter and delivery handler with error management
Register the webhook with Telegram:

```
https://api.telegram.org/bot[TOKEN]/setWebhook?url=https://[PROJECT].supabase.co/functions/v1/telegram-input
```

Set the following environment variables in your Supabase project settings (Project Settings → Edge Functions):
- `OPENAI_API_KEY`: Your OpenAI API key
- `TELEGRAM_BOT_TOKEN`: Bot token from @BotFather
- `ALLOWED_USERNAMES`: Comma-separated list of allowed Telegram usernames
- `TELEGRAM_WEBHOOK_SECRET`: Secret token for webhook validation
- `OPENAI_MODEL`: OpenAI model to use (defaults to "gpt-4.1-mini")
- `ZAPIER_MCP_URL`: MCP server URL for Zapier integrations

Once everything is configured, try a few commands with your bot, such as the example prompts above.
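To see how webhook registration and `TELEGRAM_WEBHOOK_SECRET` fit together, here is a small sketch of assembling the `setWebhook` call, including Telegram's optional `secret_token` parameter (the helper name and arguments below are illustrative, not part of the repo).

```typescript
// Illustrative: build the setWebhook URL so Telegram echoes the secret
// back on every update, letting telegram-input verify the caller.
function setWebhookUrl(botToken: string, projectRef: string, secret: string): string {
  const target = `https://${projectRef}.supabase.co/functions/v1/telegram-input`;
  const params = new URLSearchParams({ url: target, secret_token: secret });
  return `https://api.telegram.org/bot${botToken}/setWebhook?${params}`;
}
```

Opening the resulting URL in a browser (or fetching it with `curl`) is enough to register the webhook; Telegram responds with a small JSON confirmation.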
Based on 10 messages per day (roughly 300 messages per month), the total monthly cost comes to about $0.54.
This project showcases how combining modular components—with LLMs as just one piece—can create systems that are greater than the sum of their parts. I hope this inspires you to build and deploy your own personalized AI assistant while maintaining full control over your code and data. For additional inspiration, check out this excellent post by Geoffrey Litt.
Ready to build your own AI assistant? Check out the GitHub repository to get started, contribute improvements, or share your own use cases.