• MentalEdge
    8 months ago

    There’s also the fact that they can’t tell reality apart from fiction in general, because they don’t understand anything in the first place.

    LLMs have no way of differentiating fantasy RPG elements from real-world things. So they can suddenly lose the plot of what is being discussed, for seemingly no reason.

    LLMs don’t just “learn” facts from their training data. They learn how to pretend to think; they can mimic, but not really comprehend. If there were facts in the training data, they can regurgitate them, but they don’t actually know which facts apply to which subjects, or when not to make some up.

    • @Buffalox
      8 months ago

      They learn how to pretend

      True, and they are so darn good at it that it can be somewhat confusing at times.
      But the current AIs are not the ones we read about in SciFi.

      • @SpaceNoodle
        8 months ago

        I’d argue that referring to it as “AI” is a stretch since it’s all A and no I.