Hallucination

What is Hallucination?

Hallucination occurs when an AI system generates output that sounds plausible but is factually incorrect or entirely fabricated. It happens because the model produces text by predicting statistically likely continuations of its training data, not by consulting a verified source of facts. It matters because users may trust and act on false information without realizing it is inaccurate.
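
To make that concrete, here is a minimal, self-contained sketch: a toy bigram "model" built from an invented three-sentence corpus, purely illustrative. It stitches together statistically likely word sequences and has no notion of whether the result is true.

```python
import random

# A toy bigram "model" over made-up sentences. It continues text with
# statistically likely next words, as a language model does at far
# larger scale; nothing here ever checks truth.
corpus = (
    "the treaty was signed in 1842 . "
    "the treaty was ratified in 1851 . "
    "the battle was fought in 1812 ."
).split()

# Count which word follows which in the "training data".
follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=6):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))  # plausible, never verified
    return " ".join(out)

# Can emit "the treaty was fought in 1812": fluent, statistically
# plausible, and false -- a hallucination in miniature.
print(generate("the"))
```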

Technical Details

Hallucinations stem from the way generative models such as GPT-style language models and diffusion models work: they are trained to maximize the likelihood of their training data, and at inference time they sample plausible outputs with no ground-truth verification step. The problem is especially visible in autoregressive architectures, where each token is predicted from the ones before it, so a single wrong guess can snowball into a confident but fabricated passage. Techniques such as reinforcement learning from human feedback (RLHF) reduce the rate of hallucinations but do not eliminate them.
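
Because generation is driven only by likelihood, hallucinated details tend to vary when the same question is re-sampled, while well-grounded answers tend to repeat. The sketch below turns that intuition into a simple self-consistency check in the spirit of SelfCheckGPT-style detection; `ask_model` is a hypothetical stand-in that simulates a stochastic LLM call, not a real API.

```python
import random
from collections import Counter

def ask_model(question: str) -> str:
    # Hypothetical stand-in for a sampled LLM call (temperature > 0).
    # Simulates a model that is unsure of a date and guesses.
    return random.choice(["1842", "1851", "1842", "1812"])

def consistency_score(question: str, n: int = 5) -> float:
    """Fraction of sampled answers that match the most common answer."""
    answers = [ask_model(question).strip().lower() for _ in range(n)]
    _, count = Counter(answers).most_common(1)[0]
    return count / n

# Low agreement across resamples suggests guessing, not recall.
score = consistency_score("In what year was the treaty signed?")
if score < 0.6:
    print(f"Only {score:.0%} agreement -- possible hallucination.")
else:
    print(f"{score:.0%} agreement across samples.")
```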

Real-World Example

When using ChatGPT, you might ask about a historical event and receive a detailed but completely fictional account with made-up dates and names that sound convincing but have no basis in reality.
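
Since such answers can sound fully convincing, one pragmatic safeguard is to pull a checkable detail out of the response and compare it against a source you trust. The sketch below does this for a single date; the `trusted_dates` table and the model output are illustrative placeholders, not real data or a real API.

```python
import re

trusted_dates = {"moon landing": 1969}  # stand-in for a curated source

# Illustrative model output containing a fabricated year.
model_output = "The moon landing took place in 1972."
event = "moon landing"

match = re.search(r"\b(1[0-9]{3}|20[0-9]{2})\b", model_output)
if match and event in trusted_dates:
    claimed = int(match.group())
    if claimed == trusted_dates[event]:
        print("Claim matches the reference.")
    else:
        print(f"Mismatch: model said {claimed}, "
              f"reference says {trusted_dates[event]}.")
else:
    print("Could not verify the claim automatically.")
```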

Want to learn more about AI?

Explore our complete glossary of AI terms, or compare how different AI tools are affected by hallucination.