The US Department of Defense has deployed machine learning algorithms to identify targets in over 85 air strikes in Iraq and Syria this year.

The Pentagon has done this sort of thing since at least 2017, when it launched Project Maven, which sought suppliers capable of developing object recognition software for footage captured by drones. Google pulled out of the project when its own employees revolted against using AI for warfare, but other tech firms have been happy to help out.

  • @AbouBenAdhem · 9 months ago
    The issue behind the Jevons effect isn’t that the technology in question doesn’t work as advertised; it’s that, by reducing the negative consequences associated with a decision, people become increasingly willing to make that decision until the aggregate negative consequences more than cancel out the effect of the “improvement”. (See the numeric sketch after this thread.)

    • @[email protected]
      link
      fedilink
      29 months ago

      There’s really no reason to think this technology will fall victim to the Jevons paradox. These strikes are already happening remotely, and if AI/ML can better discern targets from civilians, there’s absolutely no reason to think civilian casualties will increase because of it.

      That’s like saying using AI/ML to screen for cancer will result in more people dying from cancer.

      You’re trying to apply an economic theory about the consumption of finite resources to a completely unrelated field/sector.
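
The Jevons argument in the first comment can be made concrete with a back-of-the-envelope sketch. All numbers below are hypothetical and chosen only to illustrate the arithmetic; they are not drawn from the article or from any real strike data.

```python
# Hypothetical illustration of the Jevons-style argument above:
# aggregate harm = (number of strikes) x (expected civilian harm per strike).

baseline_strikes = 100            # strikes authorized per year before the "improvement" (assumed)
baseline_harm_per_strike = 0.30   # expected civilian casualties per strike (assumed)

improved_harm_per_strike = 0.15   # per-strike harm halved by better target discrimination
improved_strikes = 250            # but lower perceived risk makes strikes easier to approve

baseline_total = baseline_strikes * baseline_harm_per_strike
improved_total = improved_strikes * improved_harm_per_strike

print(f"Baseline aggregate harm: {baseline_total:.1f}")   # 30.0
print(f"Improved aggregate harm: {improved_total:.1f}")   # 37.5

# Aggregate harm rises whenever the strike count scales up faster than the
# per-strike harm falls (here 2.5x more strikes against a 2x harm reduction).
```

Whether that happens in practice depends on how strongly decision frequency responds to the lowered per-decision risk, which is the point the two comments disagree on.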