
Investigation Uncovers Decade-Long Use of Israel's Secretive AI System in Gaza Attacks

Wesam Abo Marq
Politics
8th January 2025
Internal critics say the AI program secretly accelerated Gaza's death toll

According to The Washington Post, Israel spent more than a decade building its “AI factory,” a system that has actively supported military attacks in Gaza since October 7, 2023. The AI-driven system enabled the military to rapidly replenish its target bank and maintain an uninterrupted campaign of assaults.

Israel Relies on “AI Factory” in Gaza Attacks

The Washington Post revealed that Israel spent over ten years developing its “AI factory” to support its military assaults in Gaza.

A screenshot of The Washington Post’s article

The IOF employed an advanced artificial intelligence tool called Habsora, or “the Gospel,” to rapidly generate hundreds of additional targets, enabling the military to sustain its bombing campaign without interruption.

The Post’s investigation uncovered previously undisclosed details about the machine-learning program's inner workings and the secretive, decade-long process behind its development. 

It also exposed a heated debate within the military’s upper ranks, predating October 7, over the reliability of AI-gathered intelligence, the level of scrutiny applied to the technology’s recommendations, and concerns that reliance on AI may have weakened the IOF's broader intelligence capabilities.

Internal critics argue that the AI program played a hidden role in accelerating Gaza's death toll amid the ongoing genocide.

“What's happening in Gaza is a forerunner of a broader shift in how war is being fought,” said Steven Feldstein, senior fellow at the Carnegie Endowment, who researches the use of AI in war. “Combine that with the acceleration these systems offer — as well as the questions of accuracy — and the end result is a higher death count than was previously imagined in war.”

Israel's AI Tools Have Critical Flaws

The technology gained traction in 2020 under Yossi Sariel, then commander of Israel's intelligence Unit 8200.

Sariel led the development of the Gospel, a machine-learning software powered by hundreds of predictive algorithms, enabling soldiers to swiftly query a vast data repository referred to within the military as “the pool.”

Another tool, called Lavender, assigns a percentage score to predict the likelihood of a Palestinian being a member of a resistance group. Additional algorithmic programs include Alchemist, Depth of Wisdom, Hunter, and Flow.

Several officers within the division expressed concerns that the rapid decision-making enabled by machine-learning technology masked critical flaws.

Reports presented to senior leadership often failed to specify whether intelligence came from human analysts or AI systems, complicating the evaluation process, according to a former senior military official.

According to two former senior military leaders, an internal audit revealed that some AI systems used for processing the Arabic language contained inaccuracies and struggled to interpret key slang words and phrases, and that the software’s predictions were less accurate than those of a human officer.

Others feared that the software’s predictions were receiving undue importance. Typically, the research division would produce daily intelligence reports for senior commanders to assess potential targets. 

However, while an individual analyst could access the information behind the prediction, senior commanders did not know whether the recommendation relied on an algorithm or human sourcing.

Unit 8200 Removes Several Anti-AI Israeli Leaders

The Washington Post's investigation, citing three sources, revealed that under Sariel’s leadership and other intelligence commanders, Unit 8200 was restructured to prioritize engineers. This shift included cutting Arabic-language specialists, removing leaders resistant to AI, and disbanding groups not focused on data-mining technology.

By October 7, 60 percent of the unit's personnel worked in engineering and tech roles, double the proportion from a decade earlier, according to one source.

Two former senior commanders stated that they believe Israel's intense focus on AI significantly contributed to its failure on October 7. “This was an AI factory,” said one former military leader. “The man was replaced by the machine.” 

Former commanders convened to express concerns about the “religious attitude toward AI” that emerged within the unit during Sariel’s tenure, according to two sources.

Sariel resigned from the IOF in September amid growing scrutiny over the intelligence failures that contributed to the October 7 operation.

A screenshot of The Times of Israel’s article

Israel Uses AI Software to Estimate Civilian Casualties

Israeli soldiers were instructed to use a software program to estimate civilian casualties for a bombing campaign targeting about 50 buildings in northern Gaza.

The analysts received a simple formula: divide the number of people detected in a district (a figure derived by counting cellphones connected to a nearby cell tower) by the district's estimated population.

The system displayed a red-yellow-green traffic-light indicator for the estimated share of residents still inside a building. If the rate was 25 percent or less, it flashed green, signaling that the building could be passed to a commander to decide whether to bomb.

The system ignored cellphones that were switched off or had run out of battery, and it failed to account for children, who would not carry a cellphone.
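The reported calculation can be sketched in a few lines. This is a hypothetical illustration, not the actual software: the function names and the yellow/red boundary are assumptions; only the phones-divided-by-population formula and the 25 percent green threshold come from the reporting.

```python
def occupancy_ratio(phones_detected: int, estimated_population: int) -> float:
    """Estimated share of residents present: cellphones seen on a nearby
    cell tower divided by the district's estimated population
    (the formula described in the report)."""
    if estimated_population <= 0:
        raise ValueError("estimated_population must be positive")
    return phones_detected / estimated_population


def traffic_light(ratio: float) -> str:
    """Map the estimated occupancy rate to a red-yellow-green indicator.
    Only the 25% green cutoff is reported; the yellow/red split here
    is an illustrative assumption."""
    if ratio <= 0.25:
        return "green"   # reported: building may be passed to a commander
    if ratio <= 0.50:    # assumed boundary for illustration
        return "yellow"
    return "red"
```

The sketch also makes the reported flaws concrete: phones that are off or dead, and children who carry no phone, all shrink `phones_detected`, pushing the ratio down and making a still-occupied building more likely to flash green.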

Read More

Israel Uses AI to Pick Bombing Targets in Gaza

AI Researchers: Israel Targets U.S. and Canada with Influence and Misinformation Campaigns on Gaza War