• @HoofHearted
    0 · 5 days ago

    The terrifying thing is that everyone is criticising the LLM as being poor, when in fact it excelled at the task.

    The question asked was how many R’s are in “strawbery”, and it answered: 2.

    It also detected the typo and offered the correct spelling.

    What’s the issue I’m missing?

    • Tywèle [she|her]
      20 · 5 days ago

      The issue you are missing is that the AI answered that there is 1 ‘r’ in ‘strawbery’, even though there are 2 ‘r’s in the misspelled word. It then corrected the user with the proper spelling, ‘strawberry’, only to tell them that there are 2 ‘r’s in that word, even though there are 3.
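Those counts are easy to verify directly, e.g. with a quick Python check:

```python
# Counting the letter 'r' directly in each spelling:
print("strawbery".count("r"))   # 2 -- the misspelling the model was asked about
print("strawberry".count("r"))  # 3 -- the correct spelling
```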

      • TomAwsm
        -1 · 5 days ago

        Sure, but for what purpose would you ever ask about the total number of a specific letter in a word? This isn’t the gotcha that so many think it is. The LLM answers like it does because it makes perfect sense for someone to ask if a word is spelled with a single or double “r”.

        • @jj4211
          1 · 4 days ago

          Except many, many experts have said this is not why it happens. It cannot count letters in the incoming words. It doesn’t even know what “words” are. The input has been abstracted into tokens by the time it’s run through the model.

          It’s more like you don’t know the word strawberry, and instead you see: How many 'r’s in 🍓?

          And you respond with nonsense, because the relation between ‘r’ and 🍓 is nonsensical.
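The token abstraction can be sketched with a toy greedy tokenizer. The vocabulary and IDs below are made up purely for illustration; no real model uses them:

```python
# Toy illustration of why token-level models can't "see" letters.
# This hypothetical vocabulary maps whole chunks of text to opaque IDs.
VOCAB = {"straw": 1001, "berry": 1002, "bery": 1003}

def tokenize(word, vocab):
    """Greedy longest-match segmentation of a word into token IDs."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest remaining substring first, then shrink.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                tokens.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return tokens

# The model receives only opaque IDs; the letter 'r' is not present in them.
print(tokenize("strawberry", VOCAB))  # [1001, 1002]
print(tokenize("strawbery", VOCAB))   # [1001, 1003]

# Counting letters needs the raw string, which the model never sees:
print("strawberry".count("r"))  # 3
```

Once the text has become `[1001, 1002]`, asking the model how many ‘r’s it contains really is like asking how many ‘r’s are in 🍓.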

        • snooggums
          1 · 5 days ago

          It makes perfect sense if you do mental acrobatics to explain why a wrong answer is actually correct.

          • TomAwsm
            0 · 5 days ago

            Not mental acrobatics, just common sense.

    • Fubarberry
      3 · 5 days ago

      There’s also an “r” in the first half of the word, “straw”, so it was completely skipping over that r and just focusing on the r’s in the word “berry”.
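That skipped “r” is easy to show by splitting the word into its two halves (a quick sketch):

```python
# Split "strawberry" into the halves the comment describes:
word = "strawberry"
straw, berry = word[:5], word[5:]
print(straw, straw.count("r"))  # straw 1 -- the r that got skipped
print(berry, berry.count("r"))  # berry 2 -- the only r's the model reported
```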

      • @jj4211
        2 · 4 days ago

        It doesn’t see “strawberry” or “straw” or “berry”. It’s closer to the truth to think of it as seeing 🍓: an abstract token representing the same concept that the training data associated with the word.

      • @[email protected]
        3 · 5 days ago

        It wasn’t focusing on anything. It was generating text per its training data. There’s no logical thought process whatsoever.