db0 to [email protected] • English • 8 months ago
The Google AI isn’t hallucinating about glue in pizza, it’s just over indexing an 11 year old Reddit post by a dude named fucksmith.
@[email protected] • English • 8 months ago
Couldn’t that describe 95% of what LLMs do? It’s a really good autocomplete at the end of the day; it’s just that sometimes the autocomplete gets it wrong.
@[email protected] • English • 8 months ago
Yes, nicely put! I suppose ‘hallucinating’ describes the cases where, to the reader, it appears to state a fact, but that fact doesn’t represent anything in the training data.
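To make the “really good autocomplete” point concrete, here’s a minimal sketch: a toy bigram model on a made-up three-sentence corpus (nothing like a real LLM, and the corpus is invented for illustration). Because it only ever picks a likely next word, it can produce a fluent sentence that no training sentence ever actually stated.

```python
# A toy bigram "autocomplete": count which word follows which, then always
# continue with a likely next word. Continuation, not fact lookup.
from collections import defaultdict, Counter

corpus = [
    "cheese sticks to pizza with heat",
    "glue sticks to paper with pressure",
    "pizza tastes better with cheese",
]

# Bigram counts: for each word, how often each other word follows it.
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def autocomplete(prompt, n_words=6):
    words = prompt.split()
    for _ in range(n_words):
        options = follows.get(words[-1])
        if not options:
            break
        # Always take the most common next word: fluent and confident,
        # even when the stitched-together result was never in the data.
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(autocomplete("glue"))
# e.g. "glue sticks to pizza with heat" — reads like a stated fact,
# but no sentence in the corpus ever said it.
```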