What Is Hallucination in AI? Meaning and Real-World Impacts

A hallucination in AI occurs when a model generates a response that is inaccurate, misleading, or entirely fabricated, yet presents it confidently as fact. Even powerful models like GPT-3 and GPT-4 can hallucinate, especially when given unclear or ambiguous prompts. In HR and workplace tools, where answers can inform real decisions, hallucinations can erode user trust and compromise accuracy. This glossary entry breaks down the concept and explains how Winslow works to reduce these errors.