A Classroom That Questioned Genocide
A few months after the 7 October 2023 attacks, I sat in an introductory genocide studies course at Israel’s Open University. The lecturer promised we would emerge able to explain why Israel’s campaign in Gaza could never be called genocide. His certainty set the tone for the semester.
He framed the unfolding war as a purely military operation, devoid of intent to destroy a protected group “as such”. Without that intent, he concluded, the legal threshold for genocide was unreachable. That distinction, he insisted, is precisely what the 1948 Convention turns on.
Testing the Argument on the Ground
The academic reassurance clashed with what I began to document outside the lecture hall. Over two years, my investigations uncovered a scorched-earth policy executed across Gaza. Residential blocks, rather than discrete military sites, repeatedly became targets.
Evidence grew that the destruction obeyed a logic transcending battlefield necessity. Entire neighborhoods were levelled in what residents described as instantaneous erasure, not precision warfare. The gap between classroom theory and lived reality widened with each interview we collected.
Revealing AI-Assisted Mass Strikes
In November 2023 our team released an exposé on an Israeli system that harnessed artificial intelligence to approve large-scale strikes. The algorithm sifted vast data sets, selecting family homes of suspected militants for bombardment within seconds.
Operational officers told us the platform drastically shortened the traditional chain of human deliberation. Its output was measured not in individual targets but in “kill lists” designed for shock effect. The deliberate conflation of civilian and combatant spaces was no accident of war.
The Legal Weight of Intention
Intent, long portrayed as the missing ingredient, surfaced in military briefings we reviewed. Command language spoke of “pressure metrics” that required rising casualty figures to coerce Hamas. The strategy relied on mass deaths to shape political outcomes, echoing the Convention’s core prohibition.
Such internal discussions contradicted the university narrative of incidental harm. They supplied a trail of statements revealing that killing civilians was not merely foreseen but instrumentally valued.
South Africa Brings the Case to The Hague
When South Africa filed its January 2024 application at the International Court of Justice, portions of our November dossier appeared in the annexes. Pretoria contended that AI-facilitated mass attacks and neighborhood annihilation exhibited a genocidal pattern.
The filing argued that Israel’s operational design met both material and mental elements of genocide. The AI programme, it stated, operationalised intent by automating the destruction of Palestinian family units.
From Investigations to Courtroom Evidence
Watching our field notes migrate into legal pleadings illustrated how fact-finding can reshape the geopolitical narrative. The classroom’s neat separation between military objective and civilian extermination no longer held under judicial scrutiny.
Judges asked pointed questions about algorithms that appear to treat dense civilian areas as undifferentiated kill zones. The discussion of machine-generated target banks made the doctrine of “proportionality” sound obsolete.
Reassessing Destruction Versus Annihilation
Defenders of Israel maintain that Gaza’s ruin, however ghastly, results from legitimate military necessity. Yet the systematic bombing of homes, schools, and medical facilities—identified by AI for their social resonance—suggests a calculus aimed at group destruction, not territorial gain.
Photographs of kilometre-wide craters and collapsed refugee camps underscore that the war’s architecture is punitive. The lived Palestinian experience mirrors crimes the Genocide Convention envisaged.
Why Intent Matters—and How It Is Shown
Genocide is distinguished by intent, but intent need not be confessed. It can be inferred from patterns of conduct. Algorithms built to maximise civilian fatalities, coupled with leadership statements framing Gaza residents as collective enemies, provide a mosaic of purpose.
Such evidence invites courts to move beyond spoken pronouncements and examine structural design. The AI system reveals a programmed will to annihilate, satisfying the very criterion my professor dismissed.
An Academic Premise Upended
Returning to that campus lecture, I see how pedagogy can shield state policy. Students were offered a moral exit: as long as soldiers say they target militants, nothing else requires scrutiny. Field data, however, dismantles that comfort.
The course promised clarity but instead provoked a deeper inquiry that led straight to The Hague. Scholarship divorced from evidence proved untenable the moment real-world documents surfaced.
Implications for International Accountability
The Gaza filings force the ICJ to confront technological warfare that outsources lethal choice to code. A ruling affirming genocidal intent would reverberate far beyond Israel, setting precedents for AI use in combat.
States experimenting with algorithmic targeting will watch closely. If devastation guided by software can constitute genocide, new legal frontiers emerge for accountability.
Beyond the Courtroom
Whatever the judges decide, the investigative process has already shifted perceptions. Journalistic traces stitched to legal arguments erode the firewall between information collection and judicial action.
For Gaza’s survivors, proving intent may not rebuild homes, but it punctures narratives that frame their suffering as accidental. Recognition itself becomes a form of resistance.