• @mhague
    6 months ago

    I don’t get it. What makes the output trustworthy? If it seems real, it’s probably real? If it keeps hallucinating something, it must have some truth to it? Those seem to be the two main mindsets: “you can tell by the way it reads” and “look, it keeps saying the same thing.”

    • @Olgratin_Magmatoe
      6 months ago

      Given that multiple other commenters in the infosec.exchange thread have reproduced similar results, that right-wingers tend to have bad security, and that LLMs are pretty much impossible to fully control for now, it seems most likely that it’s real.

    • Natanael
      6 months ago

      It’s self-delusion, nothing more. Broken logic.