• @[email protected]
    link
    fedilink
    English
    637 months ago

    Some “AI” LLMs resort to light hallucination, and then ones like this straight-up gaslight you!

    • @eatCasserole · 50 points · 7 months ago

      Factual accuracy in LLMs is “an area of active research”, i.e. they haven’t the foggiest how to make them stop spouting nonsense.

      • @[email protected]
        link
        fedilink
        287 months ago

        DuckDuckGo figured this out quite a while ago: just fucking summarize Wikipedia articles and link to the precise section the text was lifted from.
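        That grounding trick is simple enough to sketch: fetch Wikipedia's own summary of a page and hand it back together with a link to the source, so every answer stays checkable. A minimal sketch using Wikipedia's public REST summary endpoint (this is not DuckDuckGo's actual pipeline; the helper name and example page title are illustrative):

        ```python
        # Minimal sketch of "summarize a source and link back to it".
        # Assumption: this illustrates the grounding idea the comment describes,
        # not DuckDuckGo's real implementation.
        import requests

        # Wikipedia's public REST summary endpoint (returns the lead-section extract).
        WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

        def grounded_answer(title: str) -> str:
            """Return Wikipedia's own summary of `title`, with the source URL attached."""
            resp = requests.get(WIKI_SUMMARY.format(title=title), timeout=10)
            resp.raise_for_status()
            data = resp.json()
            # `extract` is Wikipedia's lead-section summary; `content_urls` points
            # back at the exact page, so the claim can be verified instead of trusted.
            return f'{data["extract"]}\n\nSource: {data["content_urls"]["desktop"]["page"]}'

        if __name__ == "__main__":
            print(grounded_answer("Large_language_model"))
        ```

        The point of the design is that the text shown to the user is lifted from a citable source rather than generated, so there is nothing for a model to hallucinate.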

      • @[email protected]
        link
        fedilink
        English
        12
        edit-2
        7 months ago

        Because accuracy requires making a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, even humans aren't that great at this task. This isn't a small problem; I don't think you solve it without creating AGI.

    • @Jimmyeatsausage · 7 points · 7 months ago

      MFer accidentally got “plum” right and didn’t even know it…