Hallucination

Hallucination in LLMs refers to generating plausible-sounding but factually incorrect or fabricated information. This occurs when models fill knowledge gaps or present uncertain information with apparent certainty. Mitigation techniques include requesting sources, asking for confidence levels, providing context, and always verifying critical information independently.
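As a minimal sketch of the prompting side of these mitigations, the snippet below builds a query that explicitly asks the model to cite sources, state a confidence level, and admit uncertainty, optionally grounding it with caller-supplied context. The `call_llm` function and the prompt wording are illustrative assumptions, not a specific provider's API; wire it up to whichever LLM client you actually use.

```python
# Sketch of a hallucination-mitigation prompt: request sources, ask for a
# confidence level, provide context, and keep independent verification in mind.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for your LLM provider's client call."""
    raise NotImplementedError("Replace with a real API call.")


def ask_with_mitigations(question: str, context: str = "") -> str:
    prompt = (
        "Answer the question below. For every factual claim, cite a source "
        "or state 'no source available'. If you are unsure, say so instead "
        "of guessing. End with a confidence level: high, medium, or low.\n\n"
    )
    if context:
        # Supplying relevant context reduces the model's need to fill gaps.
        prompt += f"Context:\n{context}\n\n"
    prompt += f"Question: {question}"
    return call_llm(prompt)


# Usage (illustrative): the returned answer still needs independent
# verification before any critical fact is relied upon.
# answer = ask_with_mitigations("When was the Hubble Space Telescope launched?")
```

Even with these prompt-level safeguards, the model can still fabricate citations or overstate confidence, so independent verification of critical information remains the last line of defense.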