• @Sanctus
    56 · 1 year ago

    Why are the people in power always biased towards the most brain-dead decisions? Does this not sound wrong to any of them?

    • @a4ng3l
      15 · 1 year ago

      Risks of the AI going amok set aside, there’s something appealing in not risking human lives during conflicts, karmically or economically speaking. Also, you remove the limitations of human bodies: suddenly your airframe can give 100% of its capabilities without turning the pilot into a milkshake.

      • @darthelmet
        30 · 1 year ago

        Removing humans from your side of the war lowers the cost of going to war and allows for even more centralized power. It’s a lot easier to do morally bankrupt acts if you don’t need to convince a group of human soldiers to do it. Clearly you can anyway a lot of the time, but going for the robots is a lot cheaper/less risky.

        It’s pretty obvious why powerful people would want this and why it would be terrible for the rest of us, even without worrying about a hypothetical Skynet future.

        • ivanafterall
          6 · 1 year ago

          It’s a lot easier to murder with robots, but they’re going to need new models if they want to replace all the raping, pillaging, bureaucratic corruption, oppression, etc…

          • @a4ng3l
            4 · 1 year ago

            There’s plenty of training material for those around, I understand…

            • @[email protected]
              1 · 1 year ago

              Definitely! Reliable discrimination between combatant and non-combatant, and a chain of responsibility for that decision, are much more important than just having a human in the loop.

    • @[email protected]
      7 · 1 year ago (edited)

      It only sounds problematic to someone who cares about the lives of ordinary people. But that’s not the sort of person who dedicates their whole life to increasing their own wealth and power by any means necessary. Those are the people that get to make the decisions.

      • @Sanctus
        3 · 1 year ago

        Ah, my mistake.

    • @[email protected]
      2 · 1 year ago

      I mean, by broad definitions of AI, beyond-visual-range air-to-air missiles already do have AI making decisions to kill people.

      • FaceDeer
        0 · 1 year ago

        And under even broader definitions of AI, there’s landmines. Those have been around for a long time already. Their “AI” is just monumentally stupid; I wouldn’t mind having them be a bit more discriminating about whose limb to blow off.

  • gregorum
    33 · 1 year ago

    wasn’t there a movie about this being a bad thing? i’m pretty sure there was…

  • @[email protected]
    19 · 1 year ago

    I look forward to them being equipped with the ability to refuel by consuming biomass, self replicating, and having unbreakable encryption so they can’t be stopped!

  • WasPentalive
    14 · 1 year ago

    Do you want a Butlerian Jihad? Cuz this is how you get a Butlerian Jihad!

  • @buzz
    13 · 8 months ago (edited)

    Removed by mod

  • @[email protected]
    4 · 1 year ago (edited)

    Is the US preparing for a war with China? Cuz it certainly sounds like it.

    Quote from the article:

    The Pentagon is working toward deploying swarms of thousands of AI-enabled drones, according to a notice published earlier this year.

    In a speech in August, US Deputy Secretary of Defense, Kathleen Hicks, said technology like AI-controlled drone swarms would enable the US to offset China’s People’s Liberation Army’s (PLA) numerical advantage in weapons and people.

    “We’ll counter the PLA’s mass with mass of our own, but ours will be harder to plan for, harder to hit, harder to beat,” she said, reported Reuters.