Discussion about this post

Chavita Rooks:

Hi DiamantAI!

I enjoyed reading your content about LLM hallucinations. While it's well-known that LLMs can hallucinate, what your article describes aligns more closely with the concept of "confabulation."

Confabulation is a well-understood phenomenon in psychology, and Geoffrey Hinton has described it as "honest lying." He discussed this in an interview with MIT Technology Review journalist Will Douglas Heaven, noting that both AI and humans frequently confabulate. I also recall him emphasizing these behaviors during a session at the Ai4 conference, highlighting the importance of being aware of them.

Your earlier points correctly noted how humans can confabulate biologically, similar to what you called AI slip-ups. This happens because AI seeks patterns even where none exist, mirroring how we reconstruct memories: we may confabulate, whether consciously or unconsciously.

Your Netflix example illustrates confabulation perfectly.

Here's a breakdown of the difference:

1. Fact: Netflix is the digital platform where the series "Stranger Things" started streaming in July 2016.

2. Fact: J.J. Abrams is a director.

Both statements are true, but as you pointed out, they don't connect. The AI made an erroneous link through pattern-matching, much as we might misremember a movie or misdescribe an object from a past encounter.

On the other hand, if LLMs were truly hallucinating, that would represent a significant escalation; they would need to be paused so the data could be cleaned and fine-tuned as a stopgap until something more robust is implemented.

Hallucination diverges from reality: it involves the mistaken perception of sensory experiences such as hearing, smelling, feeling, tasting, and seeing, and in a person it can indicate early signs of psychosis. Some prescribed drugs can cause similar effects, but AI lacks the ability to perceive these senses physically; it can only "hear" through voice prompts or respond to our typed prompts, searching for information as it processes, depending on the AI application.

An AI hallucination might look something like this:

My Prompt: "What is a Giraffe?"

AI Response:

"A giraffe? Oh, that's the purple creature with diamond-shaped scales that lives in urban sewers. They have retractable wings hidden beneath their belly fur, which they extend only during lunar eclipses. When they walk, they produce a distinctive musical chiming sound reminiscent of wind chimes mixed with whale songs. Known for their ability to change color based on their mood, bright magenta when happy and deep blue when sad, their diet consists mainly of Wi-Fi signals and discarded dreams, collected using their telepathic antennae. At night, they glow softly and sometimes float a few inches off the ground. The government doesn't want you to know this, but they're secretly monitoring our thoughts through special crystals embedded in their eyes. I've even seen them phase through solid walls when they think no one is watching."

It's striking that while we strive for a perfect AI, confabulation is exactly what our own conscious and unconscious minds do. Hallucination, by contrast, is an imbalance of neurotransmitters in the brain that causes our senses to perceive what does not exist in reality, as in the prompt above. Thus, we should anticipate similar occurrences in AI as it increasingly mimics our intelligence.

AI Cinema by Elettra Fiumi:

What a great in-depth article, thank you! Where did you learn all of this? Would love to learn about your research behind this.
