@[email protected] to Technology (English) • 6 months ago
We have to stop ignoring AI's hallucination problem — www.theverge.com (208 comments)
@Aceticon (English) • 6 months ago: Hadn't heard about it before (or maybe I did but never looked into it), so I just went and found it on Wikipedia and will be reading all about it. So thanks for the info!
@feedum_sneedson (English) • 6 months ago: No worries. The person above did a good job explaining it, although they kind of mashed it together with the imagery from Plato's allegory of the cave.
That’s an excellent metaphor for LLMs.
It’s the Chinese room thought experiment.