6-4-2024 (GAZA) Amidst the ongoing violence in Gaza, a chilling report has shed light on the Israeli military’s use of an artificial intelligence-fueled targeting system known as “Lavender.” According to the report, published Wednesday by the Israeli-Palestinian outlet +972 and the website Local Call, this system has been employed for months to select bombing targets with minimal human oversight, raising grave concerns over the potential for civilian casualties.
The report, based on unnamed sources and documents, alleges that military personnel approved the AI-selected targets with what one source described as a mere “rubber stamp,” often spending only around “20 seconds” reviewing each target before authorizing a bombing. Alarmingly, an internal check of a random sample found a 10% error rate in the program’s designations, meaning it marked individuals who were not militants at all. Despite this error rate, sources claim they received approval around two weeks into the current conflict to adopt Lavender’s “kill lists” automatically.
Furthermore, the military reportedly pursued targeted individuals at home, often with family members present, through another program ominously dubbed “Where’s Daddy?” Due to lags in this program, families were allegedly killed at home even when the main target was not present, according to the report.
One unnamed senior officer, referred to as “B,” provided a chilling account: “At 5 a.m., [the air force] would come and bomb all the houses that we had marked. We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”
The report’s sources claim that the program has produced tens of thousands of targets, and that the resulting strikes have killed thousands of Palestinians, including women, children, and other non-combatants.
The revelations underscore global concerns over the high rate of civilian casualties in Gaza, where some 33,000 people have died in Israel’s military campaign since October 7th, when Palestinian militants attacked Israel, killing around 1,200 people and taking some 240 hostages. Israel has since pursued a months-long bombing campaign and ground invasion in retaliation.
The report has added fuel to international outrage over a series of recent Israeli strikes, including an attack that killed seven food aid volunteers with World Central Kitchen on Tuesday. The aid group’s leader believes the strike was intentional. Additionally, reports suggest that Gaza is heading toward an unprecedented famine as access to food and basic necessities has been limited, in part due to tight Israeli controls at border crossings.
In response to these concerns, US President Joe Biden called for an immediate temporary ceasefire during a Thursday call with Israeli Prime Minister Benjamin Netanyahu. Secretary of State Antony Blinken also emphasized the need for Israel to take concrete steps to address civilian harm, humanitarian suffering, and the safety of aid workers, stating that US policy on Gaza would depend on those steps.
While Israel dismissed two officers after a swift investigation and pledged to open more aid routes into Gaza, the situation on the ground remains dire. Reports indicate that Gaza’s largest hospital, Al-Shifa, was destroyed after an Israeli siege, with heavy casualties among militants, patients, doctors, and hospital workers. Israel has denied that any civilians were killed there.
An Israeli military spokesperson disputed the assertion that the military used an artificial intelligence system to identify militants, stating that “information systems are merely tools for analysts in the target identification process” and are subject to “independent examinations” to determine that targets meet “the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.” The statement claimed that the Israeli military did not carry out strikes where collateral damage was judged to be “excessive in relation to the military advantage” and rejected the claim of any policy to kill tens of thousands of people in their homes.
However, the report’s sources paint a different picture, with one unnamed officer in a target operation room claiming that the army’s international law department had not previously given approval for such extensive collateral damage. Another source alleged that the Israeli military judged it acceptable to kill as many as 15 to 20 civilians for every junior Hamas operative targeted and, on occasion, more than 100 civilians for a commander.
According to the report, the shocking civilian toll stems in part from the use of unguided “dumb” bombs, rather than precision munitions, in strikes on the junior militants flagged by the AI during the early weeks of the war. As one unnamed intelligence officer put it, “you don’t want to waste expensive bombs on unimportant people.”
The report’s author, Yuval Abraham, an Israeli journalist and filmmaker known for his call to end what he referred to as a system of “apartheid” in Israel and the Palestinian territories, had previously published a report in November detailing what an unnamed former intelligence officer called a “mass assassination factory,” a reference to AI-powered targeting decisions.
As the conflict rages on, the report’s revelations have reignited debate over the responsible use of artificial intelligence in military operations, the protection of civilians, and adherence to international humanitarian law.