148. Hallucination
When an AI model generates output that appears plausible but is factually incorrect, fabricated, or nonsensical.