
As militaries around the world contemplate the use of Artificial Intelligence (AI) in warfare, Israel's onslaught in the Gaza Strip offers an important point of reflection. This article explores the AI system called ‘Gospel’ or ‘Habsora’, pioneered by Unit 8200 – an Intelligence Corps unit of the Israel Defense Forces (IDF). It asserts that instead of being mesmerised by technological advancements and hastily deploying them, it is crucial to assess the potential pitfalls involved.

While the use of AI in conflict was first demonstrated in 2021 during Operation Guardian of the Walls, more information regarding Gospel has come to light in the recent conflict. Gospel is a target-generation system based on Machine Learning (ML), using Big Data which combines a myriad of information derived from ‘human intelligence (HUMINT), signal intelligence (SIGINT), visual intelligence (VISINT), geographical intelligence (GEOINT)’ etc. According to Blaise Misztal, Vice President for Policy at an institute that facilitates US-Israel military cooperation, this would include ‘cell phone messages, drone footage, satellite imagery and seismic sensors’.

Subsequently, the data is used to supply a list of suggested targets to the Military Intelligence Directorate’s research division. The list contains traces of alleged ‘operatives’ or infrastructure affiliated with Hamas or Islamic Jihad. A human commander or analyst then decides whether or not to act upon this intelligence. Since Gospel operates faster than a team of intelligence officers, the military has accelerated the rate at which targets are produced: it can now generate 100 targets per day, whereas it could only identify some 50 per year in the past, thus becoming an infamous ‘mass assassination factory’.

Gospel is controversial for several reasons. To begin with, algorithms are ‘notoriously flawed’. The AI-generated suggestions could either be completely oblivious to nuances where a human officer would show restraint, or fraught with biases of which the human officer would not be cognizant. Either way, the black-box nature of AI is an inherent problem that undermines transparency and is reason enough not to take AI’s word as gospel – let alone act upon it to attack densely populated urban settings as the IDF has.

Worse still, even if the AI-generated list were perfect, innocent civilians are deliberately targeted by ‘trigger-happy’ commanders in charge of the review. Even the IDF’s Air Force chief openly claimed that the military’s approach is not ‘surgical’. This deliberate striking of public buildings and private residences as ‘power targets’ or ‘matarot otzem’ to create shock is particularly alarming, especially since the Israeli military prides itself on ‘precision’ and technological prowess. The incessant bombing has disproportionately killed over 20,000 and displaced over 1.9 million people. Hence, instead of merely questioning the efficiency of Habsora, one should also question the Israeli military’s internal protocol, which conveniently neglects humanitarian obligations and uses AI as a ‘technological cover’ for its genocidal campaign.

Additionally, the training data underlying these ‘target banks’ evokes parallels with Israeli surveillance practices. The market value of Israeli weaponry is built on the dismal commodification of Palestinian lives: Palestinians are surveilled en masse and then treated as lab rats in a ‘Palestinian laboratory’ to produce ‘battle-tested’ technology. Overall, this dehumanisation of innocent Palestinians has significantly tarnished Israel’s reputation, raising the price of what is being termed a ‘victory’.

Thus, Israel’s Gospel is a vivid yet abhorrent glimpse into the dark side of AI-enabled warfare, for it will always be associated with one of the most brutal military campaigns in history. In reality, responsibility cannot be attributed to AI, since the onus lies on humans hastily employing error-prone systems without due diligence and with willful ignorance. Therefore, just as world leaders must call for a permanent ceasefire in Gaza and hold Israel accountable for war crimes at this critical time, they must also expedite the development of global AI legislation and safeguards, with all important actors on board. Such regulations should also entail clauses that prevent the spread of surveillance culture for military use. Until the usage protocols, efficiency and fairness of AI systems are transparently established, it can be catastrophic to employ imperfect and opaque systems in high-stakes environments like war. Moreover, regulations could help ensure that even near-perfect AI is not weaponised in the future to commit gross atrocities.

In the same vein, state militaries should ensure that the fog of war does not cloud their judgement; AI systems should not be integrated into militaries at the cost of their humanitarian obligations, but rather to ensure compliance by minimising civilian harm. The aim is not to stigmatise technology, but to prevent large-scale human suffering like that of the Palestinians. To conclude, technological advancements indeed reflect the remarkable human ability to innovate, but it is up to us to exercise responsibility and prevent AI from becoming an oppressive automaton.

Bakhtawar Iftikhar is a Research Assistant at the Centre for Aerospace & Security Studies (CASS), Islamabad, Pakistan. She can be reached at [email protected].

Design Credit:  Mysha Dua Salman

