• @[email protected]
    4 points · 10 months ago

    They are both shit at adding and subtracting numbers, dates and whatnot… they both can't do basic math, unfortunately.

    • danielbln
      5 points · 10 months ago

      It’s a language model, I don’t know why you would expect math. Tell it to output code to perform the math, that’ll work just fine.
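
      For example, here's a minimal sketch (plain Python, my own illustration, not model output) of the kind of code you'd want it to emit, using the same question that comes up further down the thread:

          # Have the model write the calculation as code and run that,
          # instead of trusting arithmetic it does "in its head".
          result = (7 * 8 / 10) ** 3
          print(result)  # ~175.616 (up to floating-point rounding)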

        • danielbln
          1 point · 10 months ago

          I just asked GPT-4:

          What’s 7 * 8 divided by 10, to the power of 3?

          Its reply:

          Let’s break this down step by step:

          First, multiply 7 and 8 to get 56.

          Then, divide 56 by 10 to get 5.6.

          Finally, raise 5.6 to the power of 3 (5.6 * 5.6 * 5.6) to get 175.616.

          So, 7 * 8 divided by 10, to the power of 3 equals 175.616

          • @[email protected]
            1 point · 10 months ago

            It’s pretty hit or miss though… I’ve had lots of good calculations with the odd wrong one sprinkled in, making it unreliable for doing maths, mostly because it presents every result with absolute certainty.

        • @[email protected]
          1 point · 10 months ago

          It’s not baffling at all… It’s a language model, not a math robot. It’s designed to write English sentences, not to solve math problems.

      • @[email protected]
        1 point · 10 months ago

        Then it should say so, instead of attempting and failing at the one thing computers are supposed to be better than us at.

        • danielbln
          1 point · edited · 10 months ago

          Well, if I try to use Photoshop to calculate a polynomial, it’s not gonna work all that well either; right tool for the job and all.

          The fact that LLMs are terrible at knowing what they don’t know should be well known by now (ironically).

          • @[email protected]
            1 point · 10 months ago

            And if Photoshop had a way to ask it to do that, it’d be a mistake.

            GPT thinking it knows something and hallucinating is ultimately a bug, not a feature, no matter what the apologists say.