misk@sopuli.xyz to Technology (English) · 2 years ago
We have to stop ignoring AI’s hallucination problem
www.theverge.com
204 comments
UnsavoryMollusk (English) · 2 years ago
They are right, though. LLMs, at their core, just determine what is statistically most probable to output next.
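A toy sketch of what "statistically most probable" means here: the model assigns a probability to each candidate next token and then either picks the top one or samples from the distribution. The tokens and probabilities below are invented for illustration, not taken from any real model.

```python
import random

# Hypothetical next-token distribution after some prompt.
# Values are made up; a real LLM produces these from its learned weights.
next_token_probs = {
    "Paris": 0.70,     # statistically likely continuation
    "London": 0.20,
    "Atlantis": 0.10,  # unlikely but still possible -> a "hallucination"
}

def sample_next_token(probs, rng=random.random):
    """Pick a token in proportion to its probability (sampling decoding)."""
    r = rng()
    cumulative = 0.0
    for token, p in probs.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # fall through on floating-point rounding

# Greedy decoding always takes the single most probable token.
greedy = max(next_token_probs, key=next_token_probs.get)
```

Even with sampling, nothing in this procedure checks whether the chosen token is *true*, only whether it is probable, which is the point the comment is making.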
Cyberflunk · edited 2 months ago
deleted by creator