• @pulsewidth
    314 hours ago

A hallucination is a false sensory perception (sights, sounds, etc.).

LLMs don’t have senses; they have input, algorithms, and output. They also have desired output and undesired output.

So, no: ‘hallucination’ fits far worse than ‘failure’, ‘error’, or ‘bad output’. However, assigning the term ‘hallucination’ does serve the billionaires in marketing their LLMs as actually sentient.