• GigglyBobble · 10 points · 10 months ago

    Unless the operator decides hitting exactly those targets fits their strategy and they can blame a software bug.

    • FaceDeer · -7 points · 10 months ago

      And then when they go looking for that bug and find the logs showing that the operator overrode the safeties instead, they know exactly who is responsible for blowing up those ambulances.
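
      To make that concrete: audit logs in systems like this are typically tamper-evident, not just text files an operator can quietly edit. Below is a minimal sketch in Python of a hash-chained log, where every entry commits to the previous one, so deleting or rewriting an override record breaks verification. The event strings and storage model are hypothetical; a real system would also sign entries and replicate them to write-once media.

      ```python
      # Hash-chained audit log: each entry commits to the previous one,
      # so deleting or rewriting an entry breaks verification downstream.
      import hashlib
      import json

      def _digest(body: dict) -> str:
          return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

      def append_entry(log: list, event: str) -> None:
          """Append an event, chained to the hash of the previous entry."""
          prev_hash = log[-1]["hash"] if log else "0" * 64
          body = {"event": event, "prev_hash": prev_hash}
          log.append({**body, "hash": _digest(body)})

      def verify(log: list) -> bool:
          """True only if every entry still matches the recorded chain."""
          prev_hash = "0" * 64
          for entry in log:
              body = {"event": entry["event"], "prev_hash": entry["prev_hash"]}
              if entry["prev_hash"] != prev_hash or entry["hash"] != _digest(body):
                  return False
              prev_hash = entry["hash"]
          return True

      log = []
      append_entry(log, "operator disabled no-strike safety")  # hypothetical event
      append_entry(log, "strike authorized")                   # hypothetical event
      assert verify(log)
      del log[0]              # "deleting the logs"...
      assert not verify(log)  # ...is itself detectable
      ```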

      • GigglyBobble · 11 points · 10 months ago (edited)

        And if the operator was commanded to do it? And to delete the logs? How naive are you to think this somehow makes war more humane?

        • FaceDeer · 0 points · 10 months ago

          Each additional safeguard makes it harder and adds another name to the eventual war crimes trial. Don’t let the perfect be the enemy of the good, especially when it comes to reducing the number of ambulances that get blown up in war zones.

      • mihies · 5 points · 10 months ago

        It doesn’t work like that, though. Western (and Western-backed) militaries can do exactly that, and do, unpunished.

      • Flying Squid · 3 points · 10 months ago

        Israeli general: Captain, were you responsible for reprogramming the drones to bomb those ambulances?

        Israeli captain: Yes, sir! Sorry, sir!

        Israeli general: Captain, you’re just the sort of man we need in this army.

        • FaceDeer · 0 points · 10 months ago (edited)

          Ah, evil people exist and therefore we should never develop technology that evil people could use for evil. Right.

          • Flying Squid · 3 points · 10 months ago

            Seems like a good reason not to develop technology to me. See also: biological weapons.

            • FaceDeer · 0 points · 10 months ago

              Those weapons come out of developments in medicine. Technology itself is neither good nor evil; it can be used for either. If you decide not to develop a technology, you’re depriving the good of it as well. My point earlier was that there are good uses for these things.

              • Flying Squid · 3 points · 10 months ago

                Hmm… so maybe we keep developing medicine, but not as a weapon, and we keep developing AI, but not as a weapon.

                Or can you explain why one should be restricted from weapons development and not the other?

              • livus · 1 point · 10 months ago

                I disagree with your premise here. Taking a life is a serious step. A machine that unilaterally decides to kill some people with no recourse to human input has no good application.

                It’s like inventing a new biological weapon.

                By not creating it, you are not depriving any decent person of anything that is actually good.