• Big Tech is lying about some AI risks to shut down competition, a Google Brain cofounder has said.
  • Andrew Ng told The Australian Financial Review that tech leaders hoped to trigger strict regulation.
  • Some large tech companies didn’t want to compete with open source, he added.
  • xapr [he/him] · 36 points · 1 year ago

    AI, climate change, and nuclear weapons proliferation

    One of those is not like the others. Nuclear weapons can wipe out humanity at any minute right now. Climate change has already started the job of wiping out humanity. When and how is AI going to wipe out humanity?

    This is not a criticism directed at you, by the way. It’s just a frustration that I keep hearing about AI being a threat to humanity, and it just sounds like a far-fetched idea. It almost seems like it’s being used to distract from much more pressing issues, like the myriad environmental problems we are already deep into, not just climate change. I wonder who would want to distract from those? Oil companies would definitely be number 1 on the list of suspects.

    • P03 Locke · 25 points · edited · 1 year ago

      Agreed. This kind of debate is about as pointless as declaring that self-driving cars are coming out in 5 years. The tech is way too far behind right now, and it won’t be useful to even talk about it until 50 years from now.

      For fuck’s sake, just because a chatbot can pretend it’s sentient doesn’t mean it actually is sentient.

      Some large tech companies didn’t want to compete with open source, he added.

      Here. Here’s the real lead. Google has been scared of open-source AI because they can’t profit off of freely available tools. Now they want to change the narrative so that the government steps in and regulates their competition. Of course, their highly-paid lobbyists will be right there to write plenty of loopholes and exceptions to make sure only the closed-source corpos come out on top.

      Fear. Uncertainty. Doubt. Oldest fucking trick in the book.

    • @Stovetop · 8 points · 1 year ago

      When and how is AI going to wipe out humanity?

      With nuclear weapons and climate change.

      • @oDDmON · 2 points · 1 year ago

        The two things experts said shouldn’t be done with AI, giving it open internet access and teaching it to code, have already been blithely ignored. It’s just a matter of time.

    • @afraid_of_zombies · 3 points · 1 year ago

      I don’t think the oil companies are behind these articles. That is very much wheels-within-wheels thinking, and corporations don’t generally invest in it. It is easier to just deny climate change than to distract everyone with something else.

      • xapr [he/him] · 1 point · 1 year ago

        You’re probably right, but I just wonder where all this AI panic is coming from. There was a story in The Washington Post a few weeks back saying that millions are being invested in university groups that are studying the risks of AI. It just seems that something is afoot that doesn’t look like a natural reaction or overreaction. Perhaps this story itself explains it: the Big Tech companies trying to tamp down competition from startups.

        • @afraid_of_zombies · 1 point · 1 year ago

          It is coming from a ratings- and click-based economy. Panic sells, so they sell panic. No one is going to click an article titled “everything mostly fine”.

        • @[email protected] · 1 point · edited · 1 year ago

          Because of dumb fucks using ChatGPT to do unethical and illegal shit: fraudulently creating works that mimic a writer and claiming they’re that writer’s work to sell for cash, blatant copyright infringement, theft, cheating on homework and tests, all sorts of dumbassery.