• @[email protected]
    2
    7 months ago

    Anything you learn from an LLM carries a margin of error that makes it dangerous and harmful. It hallucinates documentation and fake facts like an asylum inmate. And it’s so expensive compared to just having real teachers that it’s all pointless. We’ve got humans; we don’t need more humans, and adding labor doesn’t solve the problem with education.

    • @lanolinoil
      0
      7 months ago

      Bro, I was taught by a textbook in the US in the ’00s that the Statue of Liberty was painted green.

      No math teacher I ever had actually knew the level of math they were teaching.

      Humans hallucinate all the time. Almost 1 billion children don’t even have access to a human teacher, hence the boon to humanity.

      • @[email protected]
        3
        7 months ago

        Those textbooks and the people who regurgitate their contents are the training data for the LLM. Any statement you make about human incompetence is multiplied by an LLM. If they don’t have access to a human teacher then they probably don’t have PCs and AI subscriptions, either.

        • @lanolinoil
          0
          7 months ago

          yeah, but whatever that stats thing is where alpha/beta error goes away as N increases
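
          The stats idea being reached for here is that, with the significance level (alpha) held fixed, the Type II error rate (beta, failing to detect a real effect) shrinks as the sample size N grows. A minimal Monte Carlo sketch of that, assuming an illustrative one-sided z-test with effect size 0.3 and sigma 1 (the numbers are invented for illustration, not from the thread):

```python
# Sketch: Type II (beta) error of a one-sided z-test shrinks as N grows,
# while alpha stays fixed at 0.05. Effect size and sigma are assumptions.
import math
import random

random.seed(0)

def beta_error(n, effect=0.3, sigma=1.0, trials=2000):
    """Monte Carlo estimate of the Type II error rate for sample size n."""
    z_crit = 1.645  # one-sided critical value for alpha = 0.05
    misses = 0
    for _ in range(trials):
        xs = [random.gauss(effect, sigma) for _ in range(n)]
        z = (sum(xs) / n) / (sigma / math.sqrt(n))
        if z < z_crit:          # failed to reject H0 despite a real effect
            misses += 1
    return misses / trials

for n in (10, 50, 200):
    print(n, round(beta_error(n), 2))
```

          Running it shows beta dropping toward zero as n goes from 10 to 200, which is the sense in which error "goes away" with more data; alpha itself does not shrink, it is chosen.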