Israel using 'Lavender' AI machine to choose targets in Gaza killing spree

An investigation has revealed that Israel’s military used an AI-based tool known to have a 10 percent error rate to identify targets in Gaza
London
04 April, 2024
An investigation has revealed that Israel used an AI tool to identify around 37,000 potential targets based on their apparent links to Hamas [GETTY]

The Israeli military has been using an artificial intelligence (AI) tool known as 'Lavender' to identify targets across Gaza and direct its bombing campaign, which is responsible for many of the more than 33,000 Palestinians killed since October.

Lavender has played a "key role in the unprecedented bombing" of Palestinians, according to an investigation by +972 Magazine and Local Call based on the accounts of intelligence officers with first-hand experience of the AI tool in the war on the Gaza Strip.

Sources said the army permitted the killing of 15 or 20 civilians when targeting low-level Hamas members, while it authorised the killing of as many as 100 civilians to take out a Hamas commander, as happened with the flattening of an apartment block on 31 October.

The sources said the tool had such a significant influence on the military's operations that personnel treated the output of the AI machine "as if it were a human decision".

While the Lavender system is designed to mark suspects belonging to Hamas and Palestinian Islamic Jihad (PIJ), in the first few weeks of the war the machine identified around 37,000 Palestinians, and their homes, as "suspects" who could be targeted in air strikes even if family members were at home.

'Systematically attacked'

The AI tool was known to have a 10 percent error rate, but during the early stages of the war the army was not required to properly investigate how the machine selected its targets.

"The Israeli army systematically attacked the targeted individuals while they were in their homes – usually at night while their whole families were present – rather than during the course of military activity," the investigation states.

One of the intelligence officers who served in the war told +972 that it was "much easier" to bomb a family home than to pick off suspected militants when they were away from other civilians.

"The IDF bombed them [Hamas operatives] in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations," the officer said.

When it came to targeting alleged junior militants marked by the Lavender system, the army preferred to use unguided munitions, known as "dumb" bombs, which can destroy entire buildings and cause significantly more casualties than precision-guided munitions.

The Israeli army has not denied the tool's existence but said it was an information system used by analysts in the target identification process, adding that Israel tries to "reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike".

When asked about the investigation, the Israeli army said that "analysts must conduct independent investigations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives".

The investigation comes amid widespread international condemnation of Israel's military campaign in Gaza, where Israeli forces have killed more than 33,000 Palestinians since the war began on 7 October.

Earlier this week, Israeli air strikes killed seven foreign aid workers delivering food in Gaza with the World Central Kitchen, sparking global outrage over what has been called a targeted killing.

Gaza has been plunged into a deep humanitarian crisis, with aid agencies warning of "catastrophic levels of hunger" and some suspending operations following the killing of the aid workers.