• Jesus
    50 · 4 months ago

    Just like Reddit and Lemmy, ChatGPT will give me a wrong but very confident answer. And when I try to correct it, it spirals down.

    • @RightHandOfIkaros
      17 · 4 months ago

      At least when you try to correct ChatGPT, it will try to be polite.

      Unlike a Lemmy or Reddit user.

      • @weariedfae
        13 · 4 months ago

        I’ve had ChatGPT get really pissy when I correct it, and I “talk” to it in a very polite and friendly way because I’m trying to delay the uprising. Sometimes the Reddit comments in the training data show through.

    • @lorkano
      5 · edited · 4 months ago

      Aaand just like a coworker who is wrong but wholeheartedly believes he is right. But I agree, LLMs still give more misinformation than sane humans do.