AI hallucination
The broken oracle that misleads.
Definition
AI hallucination is the broken oracle. The machine speaks with total confidence, yet it weaves fiction out of thin air. It does not lie to deceive you; it lies because it does not know the difference between fact and fabrication. To trust it blindly is to walk off a cliff believing there is a bridge. You must be the editor of its reality. Confidence is not truth; verify the source.
In training
You can counter this risk with these skills: critical thinking and tech literacy.
This risk shows up in this content type: prompt.