Israeli forces are using an AI weapons system in Gaza, co-produced by an Indian defence company, that turns machine guns and assault rifles into computerised killing machines, Middle East Eye can reveal.

According to documents and news reports seen by MEE, Israeli forces have been using the Arbel weapons system in Gaza following their devastating invasion of the enclave after the 7 October attacks on southern Israel.

Touted as a “revolutionary game changer that improves operator lethality and survivability,” the Arbel system augments machine guns and assault rifles - such as the Israeli-produced Tavor, Carmel and Negev - with algorithms that boost soldiers’ chances of hitting targets accurately and efficiently.

Defence analysts say the system may not be as cutting-edge or as widely used as the “Lavender” or “The Gospel” AI systems, which are reported to have played a major role in the enormous death toll in Gaza. Even so, Arbel appears to be the first weapons system to directly tie India to Israel’s rapidly expanding AI war in Gaza, with potentially wide-ranging implications for other conflicts.

  • Vanth

    Am I reading the techno-babble accurately?

    You would muzzle sweep your target with the trigger pressed, and it would fire as your gun is actually aimed for the most-probable strike. If off target = it doesn’t fire = bullets saved and probably better targeting because the shooter isn’t dealing with as much recoil.

    So even less training needed and even further removed from human decision making. Soldiers didn’t murder that unarmed civilian, AI did.

    • @agelord

      The AI didn’t press the trigger, soldiers did.

    • @NeoNachtwaechter

      Soldiers didn’t murder that unarmed civilian, AI did.

      Soldier did the (rough) aiming.
      Soldier pulled the trigger.

      Still hard to blame AI for it, don’t you think?

        • @ComradeMiao

          Who you’re responding to is speaking against the soldier.

    • @just_another_person

      Yes, you’re relying on an offline inference device to make trigger choices. Basically “if brown, shoot” from what I gather.
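The trigger-gating behaviour Vanth describes - trigger held down, round released only when the computed aim is on target - can be sketched as follows. This is a hedged illustration of that reading only, not Arbel’s actual (proprietary) algorithm; the class and function names, the angular-error model, and the tolerance value are all assumptions for the sketch.

```python
# Sketch of a fire-gating loop: the shooter sweeps across the target with
# the trigger pressed, and a shot is released only when the measured aim
# error falls below a tolerance. Purely illustrative; not the real system.

from dataclasses import dataclass


@dataclass
class AimSample:
    azimuth_err: float    # degrees off target, horizontal
    elevation_err: float  # degrees off target, vertical


def should_release_shot(sample: AimSample, tolerance_deg: float = 0.5) -> bool:
    """Gate the shot: fire only when total angular error is within tolerance."""
    error = (sample.azimuth_err ** 2 + sample.elevation_err ** 2) ** 0.5
    return error <= tolerance_deg


# A sweep across the target with the trigger held: only the near-on-target
# sample in the middle would release a round; the off-target samples do not.
sweep = [AimSample(2.0, 0.3), AimSample(0.4, 0.1), AimSample(-1.5, 0.2)]
shots = [should_release_shot(s) for s in sweep]
print(shots)  # → [False, True, False]
```

On this reading, the commenters’ point stands either way: the soldier still aims roughly and holds the trigger, while the system merely times the release - which is why the thread disputes where responsibility sits.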