• @[email protected]
    8
    5 months ago

    That’s assuming the CEO isn’t already hallucinating.

    At least when an LLM hallucinates you can tell it and it won’t fire you.

    • @[email protected]
      4
      5 months ago

      It doesn’t have the power to do so. But it does have the power to shrug off your questions. Has an LLM ever shrugged off your questions?

      • @[email protected]
        3
        edit-2
        5 months ago

        Sort of. I had GitHub Copilot hallucinate an AWS CloudFormation template stanza.

        Asked it for the source it used for the stanza, and it gave me a URL.

        Told it that the crap it just gave me wasn’t on that page.

        It apologized and told me to RTFM.

        So, yeah, even super auto correct is a dick.