• FaceDeer (-3 points, 1 year ago)

    If you program an AI drone to recognize ambulances and medics and forbid it from blowing them up, then you can be sure it will never intentionally blow them up. That alone makes it superior to having a Mk. I Human holding the trigger, IMO.
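
    To make that concrete, here is a minimal sketch of what “recognize and forbid” could look like: a hard deny rule layered over the classifier output. The label set and detection format are made up for illustration, not from any real system.

        import logging

        # Labels the engagement logic must never fire on. This is a hard
        # rule checked after classification, not a preference the model
        # can weigh against anything else. (Hypothetical label set.)
        PROTECTED_LABELS = {"ambulance", "medic", "hospital"}

        def authorize_engagement(detection: dict) -> bool:
            """Return True only if no hard safety rule blocks the target.

            `detection` is an assumed classifier output, e.g.
            {"label": "ambulance", "confidence": 0.97}.
            """
            if detection["label"] in PROTECTED_LABELS:
                # Absolute prohibition: no confidence threshold or
                # operator flag can clear a protected label here.
                logging.warning("Engagement blocked: protected label %r",
                                detection["label"])
                return False
            return True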

    • GigglyBobble (10 points, 1 year ago)

      Unless the operator decides hitting exactly those targets fits their strategy and they can blame a software bug.

      • FaceDeer (-7 points, 1 year ago)

        And then when they go looking for that bug and find the logs showing that the operator overrode the safeties instead, they know exactly who is responsible for blowing up those ambulances.
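
        As a sketch of the kind of logging that makes the override attributable: a hash-chained append-only file, so that deleting or editing an entry breaks the chain and is itself evidence. Illustrative only, not any real system's scheme.

            import hashlib
            import json
            import time

            def append_override_event(log_path: str, operator_id: str,
                                      reason: str) -> None:
                """Append a tamper-evident record of a safety override.

                Each entry stores a hash of the log file as it stood
                before the append, so removing or rewriting earlier
                entries is detectable. (Hypothetical scheme.)
                """
                try:
                    with open(log_path, "rb") as f:
                        prev_hash = hashlib.sha256(f.read()).hexdigest()
                except FileNotFoundError:
                    prev_hash = "0" * 64  # genesis marker for an empty log
                entry = {
                    "ts": time.time(),
                    "operator": operator_id,
                    "event": "safety_override",
                    "reason": reason,
                    "prev": prev_hash,
                }
                with open(log_path, "a") as f:
                    f.write(json.dumps(entry) + "\n")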

        • GigglyBobble (11 points, 1 year ago, edited)

          And if the operator was commanded to do it? And to delete the logs? How naive are you to think this somehow makes war more humane?

          • FaceDeer (0 points, 1 year ago)

            Each additional safeguard makes it harder and adds another name to the eventual war crimes trial. Don’t let the perfect be the enemy of the good, especially when it comes to reducing the number of ambulances that get blown up in war zones.

        • mihies (5 points, 1 year ago)

          It doesn’t work like that, though. Western and Western-backed militaries do exactly that and go unpunished.

        • Flying Squid (3 points, 1 year ago)

          Israeli general: Captain, were you responsible for reprogramming the drones to bomb those ambulances?

          Israeli captain: Yes, sir! Sorry, sir!

          Israeli general: Captain, you’re just the sort of man we need in this army.

          • FaceDeer (0 points, 1 year ago, edited)

            Ah, evil people exist and therefore we should never develop technology that evil people could use for evil. Right.

            • Flying Squid (3 points, 1 year ago)

              Seems like a good reason not to develop technology to me. See also: biological weapons.

              • FaceDeer (0 points, 1 year ago)

                Those weapons come out of developments in medicine. Technology itself is neither good nor evil; it can be used for good or for evil. If you decide not to develop a technology, you’re depriving the good of it as well. My point earlier was to show that there are good uses for these things.

                • Flying Squid (3 points, 1 year ago)

                  Hmm… so maybe we keep developing medicine but not as a weapon, and we keep developing AI but not as a weapon.

                  Or can you explain why one should be restricted from weapons development and not the other?

                • livus (1 point, 1 year ago)

                  I disagree with your premise here. Taking a life is a serious step. A machine that unilaterally decides to kill some people with no recourse to human input has no good application.

                  It’s like inventing a new biological weapon.

                  By not creating it, you are not depriving any decent person of anything that is actually good.

    • Chuck (6 points, 1 year ago)

      It’s more like we’re giving the machine more opportunities to go off accidentally, or potentially encouraging more use of civilian camouflage to try to evade our hunter-killer drones.

    • @kromem (3 points, 1 year ago)

      Right, because self-driving cars have been great at correctly identifying things.

      And those LLMs have been following their rules to the letter.

      We really need to let go of our projected concepts of AI in the face of what’s actually been arriving. And one of those things we need to let go of is the concept of immutable rule-following and accuracy.

      In any real world deployment of killer drones, there’s going to be an acceptable false positive rate that’s been signed off on.
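
      Back-of-envelope, with made-up numbers, just to show how a “small” signed-off rate scales:

          # All numbers assumed for illustration; nothing here comes
          # from a real program.
          false_positive_rate = 0.001     # signed-off rate: 0.1% per engagement
          engagements_per_year = 50_000   # assumed operational tempo

          expected_false_positives = false_positive_rate * engagements_per_year
          print(f"Expected misidentified targets per year: "
                f"{expected_false_positives:.0f}")
          # -> Expected misidentified targets per year: 50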

      • FaceDeer (1 point, 1 year ago)

        We are talking about developing technology, not existing tech.

        And actually, machines have become quite adept at image recognition. For some things they’re already better at it than we are.

    • @crypticthree (3 points, 1 year ago)

      Did you know that “if” is the middle word of “life”?