• FaceDeer
    29 · 7 months ago

    The problem with AI hallucinations is not that the AI was fed inaccurate information, it’s that the AI comes up with information it wasn’t fed in the first place.

    As you say, this is a problem that humans have too. But I’m not terribly surprised these AIs have it, because they’re built in mimicry of how aspects of the human mind work. And in some cases it’s desirable behaviour, for example when you’re using an AI as a creative assistant. You want it to come up with new stuff in those situations.

    It’s just something you need to keep in mind when coming up with applications.

    • @AdrianTheFrog
      4 · 7 months ago

      Not in the case of the Google search AI. It quotes directly from unreliable sources.

      • FaceDeer
        4 · edit-2 · 7 months ago

        Exactly, which is why I’ve objected in the past to calling Google Overview’s mistakes “hallucinations.” The AI itself is performing correctly; it’s giving an accurate overview of the search results it’s being told to summarize. It’s just being fed incorrect information.