299. Hallucination

Instances where an AI model generates content that sounds plausible but is false, fabricated, or not grounded in its input data.
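One way to make the "not grounded in the input data" part concrete is a lexical-overlap heuristic: flag output sentences that share few words with the source text. This is a toy sketch, not a real hallucination detector (production systems typically use entailment models or fact-checking pipelines); the function name and the 0.5 threshold are illustrative assumptions.

```python
import re


def _words(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z']+", text.lower()))


def ungrounded_sentences(source: str, output: str, threshold: float = 0.5) -> list[str]:
    """Flag output sentences whose word overlap with the source falls below threshold.

    A crude proxy for groundedness: a sentence sharing few words with the
    source is more likely to be hallucinated. Threshold is a tunable guess.
    """
    source_words = _words(source)
    flagged = []
    for sentence in output.split("."):
        words = _words(sentence)
        if not words:
            continue
        overlap = len(words & source_words) / len(words)
        if overlap < threshold:
            flagged.append(sentence.strip())
    return flagged
```

For example, given the source "Paris is the capital of France." and the output "Paris is the capital of France. The moon is made of green cheese.", only the second sentence is flagged as ungrounded.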