Israel using AI chatbot 'Genie' to carry out military operations in Gaza - report

Israel is using an AI chatbot, Genie, to decide who lives and who dies in Gaza's ongoing genocide.
15 April, 2025
Last updated: 16 April, 2025 08:23 AM
Israel has begun operational deployment of a military-grade artificial intelligence chatbot named Genie, a system now used by commanders to make battlefield decisions in Gaza.

Far from a neutral technological upgrade, Genie represents the accelerating use of AI in warfare - automating decisions that carry life-and-death consequences in a conflict where Israel already faces credible accusations of genocide.

According to reports in Yedioth Ahronoth and Ynet on Tuesday, the chatbot is modelled on ChatGPT but integrated into Israel's private military networks. Officers can type natural-language queries and receive detailed, real-time responses from the army's massive operational cloud. The tool is already in use in all Israeli military command centres and is being applied during live operations in Gaza.

Described by Israeli media as resembling "a clean web page with a text box in the middle and the title: 'What interests you?'," Genie offers Israeli officers instant answers to open-ended questions, drawing on constantly updated data from across the Israeli army's operational systems.

It was launched a month ago as a trial app on the Israeli military's closed internal network, and a mobile version is in development, Ynet said.

Named disturbingly after the wish-granting figure from Aladdin, Genie is being deployed in a war where over 51,000 Palestinians - the majority women and children - have been killed in Israel's relentless bombardment of Gaza. The chatbot doesn't merely retrieve data; it identifies anomalies, summarises events, and generates operational insights, in essence assisting Israeli commanders in selecting targets.

The Israeli army insists that Genie is still in a "trial phase" and doesn’t autonomously make decisions. But this disclaimer masks the reality: Genie is already shaping decisions about who lives and who dies in a warzone where civilian areas, hospitals, refugee camps, and schools have all been systematically targeted.

In one example, developers explained that a commander could ask Genie which Israeli unit first raided the now-destroyed Al-Shifa Hospital - Gaza's largest medical facility - and receive an instant answer. But the use of AI in warfare is not a mark of efficiency; it is digital bureaucracy weaponised for mass killing.

The chatbot was built by a sub-unit called the "Text Factory" within the Israeli army's Matzpen division, which oversees the country's digital war infrastructure. Staffed by 20 carefully selected elite soldiers, many with advanced degrees in AI and data science, the unit operates "exactly like a high-tech startup", according to its commander, identified in Israeli media only as Captain D.

"Today, when a commander wants to know, for example, who was the first team to raid Al-Shifa Hospital, they ask themselves: where can I find this information? It's hard to know which system holds the answer. This is inefficient. Our basic insight was: if it's not simple, it won't be effective. A commander in the field cannot be entangled in complex procedures," Captain D said in an interview with Yedioth Ahronoth.

He added that Genie was also connected in real time to all operational systems, updating automatically with each new document uploaded or edited.

'A mass assassination factory'

Genie is just one part of Israel’s growing arsenal of AI-based killing tools. In December 2023, The Guardian revealed details about another secretive Israeli platform known as 'Habsora' (The Gospel) — an AI-assisted target-generation system that accelerated what sources inside the military describe as a "mass assassination factory".

The Israeli army's Target Administration Division - formed in 2019 - uses Habsora to produce daily kill lists. One official described the system as generating 100 new targets per day, compared with around 50 targets per year in previous operations. These included the private homes of suspected Hamas members, regardless of their rank or strategic value.

According to intelligence sources cited by +972 Magazine and its Hebrew-language partner outlet Local Call, the system assigns "collateral damage scores" to each strike, estimating how many civilians might die. This data is reviewed rapidly, often with little scrutiny, before approval.

A former targeting officer said that civilian deaths were not a deterrent. "We killed what I thought was a disproportionate amount of civilians," the source told Local Call. "The emphasis is on quantity, not quality."

Another insider said commanders are judged by how many targets they generate, not by how lawful or accurate the strikes are.

Even when a human eye is "in the loop", experts warn this amounts to little more than rubber-stamping machine-generated kill orders.

While Israeli officials tout AI as a way to improve precision and reduce civilian harm, the evidence points the other way. AI has been weaponised to industrialise the process of killing amid a campaign that leading experts have described as genocidal.

This automation fits seamlessly into a broader agenda echoed by far-right Israeli leaders and Donald Trump's administration, one that envisions Gaza emptied of its population through mass expulsion - or extermination - to make way for permanent Israeli control over the Palestinian enclave.