• Flying Squid
    8 months ago

    Maybe don’t use something that is rarely discussed without using the word “hallucination” in your plans to FUCKING KILL PEOPLE?

        • @[email protected]
          8 months ago

          LLMs hallucinate all the time; the hallucination is the feature. Depending on how you design the neural network, you can get an AI that doesn’t hallucinate. LLMs have to do it, because they’re mimicking human speech patterns and predicting one of many possible responses.

          A model that tries to predict locations of people likely wouldn’t work like that.
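
          A toy sketch of that sampling behaviour, with an invented vocabulary and made-up probabilities (no real model or API): the generator scores several plausible next tokens and draws one at random, so a fluent-but-wrong completion is always a possible outcome.

          ```python
          # Toy sketch: why sampling-based text generation can "hallucinate".
          # All tokens and probabilities below are invented for illustration.
          import random

          # Hypothetical next-token distribution after the prompt
          # "The capital of Australia is"
          next_token_probs = {
              "Canberra": 0.60,    # correct
              "Sydney": 0.30,      # fluent but wrong
              "Melbourne": 0.10,   # fluent but wrong
          }

          def sample_next_token(probs: dict[str, float]) -> str:
              """Draw one token in proportion to its probability."""
              tokens = list(probs)
              weights = [probs[t] for t in tokens]
              return random.choices(tokens, weights=weights, k=1)[0]

          # About 4 runs in 10 complete the sentence incorrectly, even
          # though every draw reads as a confident, fluent answer.
          print("The capital of Australia is", sample_next_token(next_token_probs))
          ```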