• @SpaceNoodle
    link
    English
    48 months ago

    Well, there’s no understanding or reasoning behind ChatGPT, so …

    • @Cort
      link
      English
      18 months ago

      Is it explaining its reasoning or fabricating a plausible justification for the outputs? We’ll never know

    • @[email protected]
      link
      English
      8 months ago

      Depends a bit on perspective and nuance. GPT-4 pretty much always returns text relevant to the prompt: the neural net sees A and knows B comes next. That's a form of understanding. Not understanding would look like being unable to see A at all and outputting something irrelevant.

      For reasoning, which I take to mean step-by-step logic, it needs a good handholding prompt, but then it can consistently produce grade-school-level solutions to logical problems.
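      A minimal sketch of what such a handholding "step by step" prompt might look like. The helper function and the wording of the prompt are illustrative assumptions, not any specific model's API; no actual model is called here.

      ```python
      # Hypothetical helper that wraps a question in a "step by step"
      # (chain-of-thought style) handholding prompt. The prompt text is
      # an illustrative assumption, not taken from any particular system.
      def build_cot_prompt(question: str) -> str:
          return (
              "Solve the following problem. Think step by step, stating "
              "each intermediate step before giving the final answer.\n\n"
              f"Problem: {question}\n"
              "Step 1:"
          )

      # The resulting string would then be sent to the model of your choice.
      prompt = build_cot_prompt(
          "A train travels 60 km in 1.5 hours. What is its average speed?"
      )
      print(prompt)
      ```

      The idea is simply that asking explicitly for intermediate steps tends to elicit more consistent multi-step answers than asking for the final answer alone.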

      Neither is what humans would call true understanding or true reasoning, but it's way too early to judge AI by human standards.