# 📜 Prompt Inventory (Ground-Truth)

All generation / verification prompts currently hard-coded in the codebase.
Last updated: 2025-07-06

Edit process: if you change a prompt in code, update this file; once we migrate to the central registry, delete the corresponding entry here instead.


## 1. Indexing / Context Enrichment

| ID | File & Lines | Variable / Builder | Purpose |
|---|---|---|---|
| `overview_builder.default` | `rag_system/indexing/overview_builder.py` 12-21 | `DEFAULT_PROMPT` | Generate a one-paragraph document overview for search-time routing. |
| `contextualizer.system` | `rag_system/indexing/contextualizer.py` 11 | `SYSTEM_PROMPT` | System instruction: explain the summarisation role. |
| `contextualizer.local_context` | same file 13-15 | `LOCAL_CONTEXT_PROMPT_TEMPLATE` | Human message: wraps neighbouring chunks. |
| `contextualizer.chunk` | same file 17-19 | `CHUNK_PROMPT_TEMPLATE` | Human message: shows the target chunk. |
| `graph_extractor.entities` | `rag_system/indexing/graph_extractor.py` 20-31 | `entity_prompt` | Ask the LLM to list entities. |
| `graph_extractor.relationships` | same file 53-64 | `relationship_prompt` | Ask the LLM to list relationships. |

## 2. Retrieval / Query Transformation

| ID | File & Lines | Purpose |
|---|---|---|
| `query_transformer.expand` | `rag_system/retrieval/query_transformer.py` 10-26 | Produce query rewrites (keywords, boolean). |
| `hyde.hypothetical_doc` | same file 115-122 | HyDE hypothetical document generator. |
| `graph_query.translate` | same file 124-140 | Translate the user question into a JSON KG query. |
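The HyDE entry above follows the standard hypothetical-document-embeddings pattern: embed an LLM-generated pseudo-answer rather than the raw query, since the pseudo-answer usually lands closer to real answer passages in embedding space. A minimal sketch of the pattern follows; the `generate` and `embed` stubs are hypothetical stand-ins, not this repo's actual LLM or embedding API.

```python
# Minimal sketch of the HyDE retrieval pattern. `generate` and `embed`
# are stand-in stubs; a real pipeline would call the configured LLM
# and embedding model instead.

def generate(prompt: str) -> str:
    """Stub LLM call: pretend to write a hypothetical answer."""
    return f"Hypothetical passage answering: {prompt}"

def embed(text: str) -> list[float]:
    """Stub embedding: hash characters into a tiny fixed-size vector."""
    vec = [0.0] * 8
    for i, ch in enumerate(text):
        vec[i % 8] += ord(ch) / 1000.0
    return vec

def hyde_query_vector(question: str) -> list[float]:
    # 1. Ask the LLM to invent a document that would answer the question.
    pseudo_doc = generate(f"Write a short passage that answers: {question}")
    # 2. Embed the pseudo-document, not the raw question.
    return embed(pseudo_doc)
```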

## 3. Pipeline Answer Synthesis

| ID | File & Lines | Purpose |
|---|---|---|
| `retrieval_pipeline.synth_final` | `rag_system/pipelines/retrieval_pipeline.py` 217-256 | Turn verified facts into an answer (with directives 1-6). |

## 4. Agent – Classical Loop

| ID | File & Lines | Purpose |
|---|---|---|
| `agent.loop.initial_thought` | `rag_system/agent/loop.py` 157-180 | First LLM call to think about the query. |
| `agent.loop.verify_path` | same file 190-205 | Secondary thought loop. |
| `agent.loop.compose_sub` | same file 506-542 | Compose the answer from sub-answers. |
| `agent.loop.router` | same file 648-660 | Decide which subsystem handles the query. |

## 5. Verifier

| ID | File & Lines | Purpose |
|---|---|---|
| `verifier.fact_check` | `rag_system/agent/verifier.py` 18-58 | Strict JSON-format grounding verifier. |

## 6. Backend Router (Fast path)

| ID | File & Lines | Purpose |
|---|---|---|
| `backend.router` | `backend/server.py` 435-448 | Decide "RAG vs direct LLM" before heavy processing. |
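The fast-path router's job is to gate heavy retrieval: small talk goes straight to the LLM, document questions go to RAG. A hypothetical sketch of such a gate follows; in practice the decision is made by the LLM prompt in `backend/server.py`, and the keyword heuristic and names here are purely illustrative.

```python
# Illustrative fast-path gate: route a query to "rag" or "direct"
# before any heavy retrieval work. A keyword heuristic stands in for
# the real LLM-based decision.
DOC_SIGNALS = ("document", "file", "pdf", "according to", "section", "page")

def route(query: str) -> str:
    q = query.lower()
    if any(signal in q for signal in DOC_SIGNALS):
        return "rag"      # needs retrieval over indexed documents
    return "direct"       # small talk / general knowledge: skip retrieval
```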

## 7. Miscellaneous

| ID | File & Lines | Purpose |
|---|---|---|
| `vision.placeholder` | `rag_system/utils/ollama_client.py` 169 | Dummy prompt for the VLM colour check. |

## Missing / To-Do

1. Verify whether `ReActAgent.PROMPT_TEMPLATE` captures every placeholder; some of the entries above may need explicit IDs when we move to the central registry.
2. Search the TS/JS code once the backend prompts are ported (currently there are none).

Next step: create `rag_system/prompts/registry.yaml` and start moving each prompt above into a key–value entry with an identical ID. Update callers gradually using the helper proposed earlier.
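The migration above could be accessed through a small lookup helper. The sketch below is one possible shape, not the helper referenced earlier: the registry is shown as an in-memory dict (in the real migration the entries would live in `registry.yaml` and be loaded with a YAML parser), the `get_prompt` name is an assumption, and the prompt texts are invented placeholders that merely mirror each entry's stated purpose.

```python
# Hypothetical sketch of a central prompt registry and its accessor.
# Keys mirror the IDs in the tables above; the prompt bodies and the
# `get_prompt` helper are illustrative assumptions.
from string import Template

# In the real migration these entries would live in
# rag_system/prompts/registry.yaml and be loaded at startup.
_REGISTRY: dict[str, str] = {
    "overview_builder.default": (
        "Write a one-paragraph overview, for search-time routing, "
        "of the following document:\n$document"
    ),
    "hyde.hypothetical_doc": (
        "Write a short passage that plausibly answers the question:\n$question"
    ),
}

def get_prompt(prompt_id: str, **values: str) -> str:
    """Look up a prompt by ID and substitute its placeholders."""
    template = _REGISTRY[prompt_id]
    return Template(template).safe_substitute(values)
```

Keeping the registry IDs identical to the ones in this file means callers can be switched over one at a time, e.g. `get_prompt("hyde.hypothetical_doc", question=user_query)`, without renaming anything.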