Israel accused of using AI to target thousands in Gaza, as killer algorithms outpace international law
By Natasha Karner, RMIT University
The Israeli army used a new artificial intelligence (AI) system to generate lists of tens of thousands of human targets for potential airstrikes in Gaza, according to a report published last week. The report comes from the nonprofit outlet +972 Magazine, which is run by Israeli and Palestinian journalists.
The report cites interviews with six unnamed sources in Israeli intelligence. The sources claim the system, known as Lavender, was used with other AI systems to target and assassinate suspected militants – many in their own homes – causing large numbers of civilian casualties.
According to another report in the Guardian, based on the same sources as the +972 report, one intelligence officer said the system “made it easier” to carry out large numbers of strikes, because “the machine did it coldly”.
As militaries around the world race to use AI, these reports show us what it may look like: machine-speed warfare with limited accuracy and little human oversight, at a high cost to civilians.
Military AI in Gaza is not new
The Israeli Defence Force denies many of the claims in these reports. In a statement to the Guardian, it said it “does not use an artificial intelligence system that identifies terrorist operatives”. It said Lavender is not an AI system but “simply a database whose purpose is to cross-reference intelligence sources”.
But in 2021, the Jerusalem Post reported an intelligence official saying Israel had just won its first “AI war” – an earlier conflict with Hamas – using a number of machine learning systems to sift through data and produce targets. In the same year a book called The Human–Machine Team, which outlined a vision of AI-powered warfare, was published under a pseudonym by an author recently revealed to be the head of a key Israeli clandestine intelligence unit.
Last year, another +972 report said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb.
Read Full Story https://theconversation.com/israel-accused-of-using-ai-to-target-thousands-in-gaza-as-killer-algorithms-outpace-international-law-227453