• @[email protected]
    5 days ago

    Not even that. LLMs have no concept of meaning or understanding. What they do, in essence, is fill in space based on previously trained patterns.

    It's like showing someone a bunch of shapes, then drawing a few lines and asking them to complete the shape. All the shapes are lamp posts, but you haven't told them that, and they have no idea what a lamp post is. They'll just produce results resembling the shapes you've shown them, which generally end up looking like lamp posts.

    Except the "shape" in this case is a sentence, or a poem, or self-insert erotic fan fiction, none of which an LLM "understands". It just matches the shape of what's been written so far against previous patterns and extrapolates.

    • NaibofTabr
      5 days ago

      Well yes… I think that’s essentially what I’m saying.

      It’s debatable whether our own brains really operate any differently. For instance, if I say the word “lamppost”, your brain determines the meaning of that word based on the context of my other words around “lamppost” and also all of your past experiences that are connected with that word - because we use past context to interpret present experience.

      In an abstract, nontechnical way, training a machine learning model on a corpus of data is sort of like trying to give it enough context to interpret new inputs in an expected/useful way. In the case of LLMs, it’s an attempt to link the use of words and phrases with contextual meanings so that a computer system can interact with natural human language (rather than specifically prepared and formatted language like programming).

      It’s all just statistics though. The interpretation is based on ingestion of lots of contextual uses. It can’t really understand… it has nothing to understand with. All it can do is associate newly input words with generalized contextual meanings based on probabilities.
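      That "association by probability" can be sketched with a toy bigram model. (This is my own illustration, not how a real LLM works internally: LLMs use neural networks over tokens, not word-pair counts. But the complete-the-text-from-statistics idea is the same.)

```python
# Toy sketch (illustration only): a bigram "model" that completes text
# purely from counted statistics, with no notion of what any word means.
from collections import Counter, defaultdict

# A tiny made-up training corpus.
corpus = (
    "the old lamp post stood on the corner . "
    "the new lamp post stood by the road ."
).split()

# Count which word follows which -- the "previously trained patterns".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def extrapolate(start, n=5):
    """Extend a prompt by repeatedly picking the most frequent next word."""
    words = [start]
    for _ in range(n):
        counts = following.get(words[-1])
        if not counts:
            break  # never seen this word before; no pattern to match
        words.append(counts.most_common(1)[0][0])
    return " ".join(words)

print(extrapolate("lamp"))
```

      Given the prompt "lamp", it continues with "post stood …" purely because those words followed "lamp" in the data; at no point does it have anything that could understand what a lamp post is.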

      • @MutilationWave
        5 days ago

        I wish you’d talked more about how we humans work. We are at the mercy of pattern recognition. Even when we try not to be.

        When “you” decide to pick up an apple it’s about to be in your hand by the time your software has caught up with the hardware. Then your brain tells “you” a story about why you picked up the apple.

        • @IlovePizza
          5 days ago

          I really don’t think that is always true. You should see me going back and forth in the kitchen trying to decide what to eat 😅