
Israel Uses AI to Pick Bombing Targets in Gaza

Wesam Abo Marq
News
8th April 2024
Israel used AI to identify potential bombing targets in Gaza (Getty)

According to an investigation conducted by +972 Magazine and Local Call, Israeli military officials have utilized artificial intelligence technology to identify potential bombing targets in Gaza. The report, citing six Israeli intelligence sources linked to the program, suggests that the process of human review for these targets was cursory at best.

Israel Deploys AI to Identify Targets in Gaza

The Israeli military's bombing campaign in Gaza reportedly relied on a covert AI-powered database that identified around 37,000 potential targets, according to intelligence sources familiar with the ongoing war. 

Lavender, an AI-powered database, was developed by the IOF's elite intelligence division, Unit 8200, which is akin to the United States' National Security Agency or the U.K.'s Government Communications Headquarters (GCHQ).

The officials, quoted in a comprehensive investigation by the online publication jointly operated by Palestinians and Israelis, disclosed that the AI-driven tool, known as "Lavender," was acknowledged to have a 10% error margin.

A screenshot of the +972 article.

These sources also assert that Israeli military officials may have allowed significant numbers of Palestinian civilians to be killed, particularly in the initial stages of the war. Their accounts offer a rare inside look into the experiences of Israeli intelligence personnel using machine-learning technology, such as the AI system known as Lavender, to identify targets during the six-month-long war. 

Israel's utilization of advanced AI systems in its conflict has raised significant legal and moral questions, reshaping the dynamic between military personnel and technology on the battlefield.

An intelligence officer who used Lavender remarked, "This is unparalleled, in my memory." They added that they had more confidence in a "statistical mechanism" than in a grieving soldier, stating, "Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier."

A screenshot of the Guardian’s article.

Another Lavender user questioned the significance of the human role in the selection process, stating, "I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time."

Israel Systematically Attacked Targets in Gaza at Night

The +972 Magazine further reported that the Israeli army "systematically attacked" individuals in their homes, often at night when entire families were present. According to the sources, Israeli airstrikes guided by the AI program's decisions killed thousands of Palestinians, most of them women, children, or non-combatants, particularly in the initial weeks of the conflict.

The report, citing sources, stated that when targeting alleged junior militants, the army "preferred" to use so-called dumb bombs, unguided munitions capable of causing extensive damage.

Israel Does Not Deny AI Use in Gaza Bombing Targets

When questioned about +972 Magazine's findings, the IOF did not contest the tool's existence but denied using AI to identify suspected terrorists. In a detailed statement, the IOF underscored that "information systems serve as aids for analysts in the target identification process" and highlighted Israel's efforts to "minimize harm to civilians as much as possible within the operational circumstances prevailing at the time of the strike."

A screenshot of the Guardian’s article.

The IOF emphasized that "analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IOF directives."

Nevertheless, as one official told +972 Magazine, human personnel often served merely as a "rubber stamp" for the machine's decisions, typically spending only about 20 seconds per target, just long enough to confirm the target was male, before authorizing a bombing.

The Guardian Exposes the Identity of the Commander Behind Israel's AI System

The Guardian has uncovered the identity of the commander of Israel’s Unit 8200, a highly secretive figure responsible for leading one of the world’s most powerful surveillance agencies. In an exclusive article, the outlet revealed that the head of Unit 8200, described as the architect of the Lavender AI system, is Yossi Sariel.

A screenshot of the Guardian’s article.

Sariel's identity, which has been closely guarded for over two decades, was inadvertently exposed due to a security lapse related to a book he published on Amazon. The digital trail left by the book led to a private Google account created in Sariel's name, revealing his unique ID and links to the account’s maps and calendar profiles.

The Guardian has independently verified Sariel's connection to the book, titled "The Human Machine Team," in which he presents a groundbreaking vision for the integration of artificial intelligence into military operations, transforming the dynamic between military personnel and machines.

U.S. and U.N. Express Concerns Over Israel's Use of AI to Pick Bombing Targets in Gaza

The United States and the United Nations have both expressed concerns over reports suggesting that Israel has been utilizing artificial intelligence to identify bombing targets in Gaza.

White House national security spokesperson John Kirby informed CNN on Thursday that the U.S. government was investigating the matter following media reports.

U.N. Secretary-General Antonio Guterres voiced serious concern over the issue on Friday, saying he was deeply disturbed by reports that Israel was using artificial intelligence in its targeting operations. He warned that the military's reliance on AI could lead to a significant number of civilian casualties, particularly in densely populated residential areas.

A screenshot of France 24’s article.

"No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms," he said.

Israel Killed Foreign Aid Workers in Gaza

The investigation emerges against the backdrop of heightened international attention on Israel's military operations, following targeted airstrikes that killed several foreign aid workers from World Central Kitchen as they delivered food in Gaza.

According to the Gaza Ministry of Health, Israel's siege of Gaza has killed at least 33,137 people and precipitated a severe humanitarian crisis. In addition, a United Nations-backed report indicates that nearly three-quarters of the population in northern Gaza is grappling with catastrophic levels of hunger.

Read More

Visual Investigation Points to Israel, Not Hamas, Responsibility in Killing the WCK Aid Workers

Misbar Confirms Israel's Accountability for Killing Four Unarmed Palestinian Civilians
