• @[email protected]
    762 days ago

    It is possible for AI to hallucinate elements that don’t work, at least for now. This requires some level of human oversight.

So, the same as any other LLM, then, and they just got lucky.

    • @ATDA
      26 hours ago

It’s like putting a million monkeys in a writers’ room, but supercharged on meth and consuming insane resources.