• @[email protected]
    98 · 2 months ago

    Testing armed robot dogs in the Middle East instead of the US is pretty telling.

    Can’t be accidentally murdering Americans with a software glitch.

    • @_stranger_
      56 · 2 months ago

      Really has a strong “testing in production” vibe

          • @beebarfbadger
            4 · 2 months ago (edited)

            Don’t worry, no danger of killing real people in the Middle East. All the “collateral damage” will be brown people, not Americans. They’ll have all the kinks ironed out and will make sure that the AI doesn’t hurt white targets before the technology is distributed to every national police district.

            I wish this post even deserved a /s.

    • @[email protected]
      14 · 2 months ago

      Which is wild when you consider that police in the US are less disciplined than troops overseas, and the US still uses substances banned by the Geneva Convention on its own civilian population. So if even the US won’t test this on its own people, it’s bad.

      • Jojo, Lady of the West
        8 · 2 months ago

        Listen, the Geneva Convention only specifies what we can’t use on enemies, okay? As long as the targets are technically friendlies, it’s fair game!

        • @PapstJL4U
          English · 4 · 2 months ago

          The Geneva Conventions are for war, and soldiers are combatants, not criminals, by default (though that status can switch easily). As an example, hollow-point rounds against criminals are okay, since they can protect surrounding bystanders, but they’re banned in war.

          It’s a bit weird, but for countries, war is different from domestic problems.