#violence
Moreover, the evacuation of wounded soldiers from the battlefield became impossible due to immediate drone attacks, which left them suffering for days or, at times, weeks. From an ethical point of view, lethal autonomous weapons are perhaps the most disturbing. Current discussions hold that autonomous weapons are unacceptable for two main reasons: they remove the human oversight that connects the decision-maker to the consequences of their choices, and they deny the human dignity of the victim. The ethics of seeing from a drone, the so-called god's-eye view that enables the conduct of war from a distance, is controversial because of the dehumanising effect of killing as a practice of manhunting. The execution of violence is outsourced to remotely controlled machines as an efficient way to eliminate the enemy while minimising the death toll among the forces of the military powers deploying them.
At the same time, generating ISIS images on Stable Diffusion does not produce images that explicitly depict acts of violence or bloodshed, even though such images can be found within the training data, according to Have I Been Trained. A prompt containing ‘ISIS execution’ or ‘ISIS beheading’ may yield a hooded militant lying on the ground, but with no overtly disturbing elements. My first attempts to generate ISIS-related images on Stable Diffusion took place in 2023, and they produced a distinctly different set of images than the slick and sanitized results one year later. The earlier attempts led to images with certain photorealistic elements – grain, texture – that elicited a feeling of genuine disturbance, similar to what I felt when looking at ISIS videos. In this way, those results felt more ‘authentic’ than what can be produced today. But this feeling of ‘authenticity’ was confusing, because the figures in these images were deformed, some lacking limbs or caught in awkward postures, clearly not real. Rationally, I knew that these images did not represent actual people but were statistical renderings derived from a large sample of image data. They disturbed me nonetheless, because their appearances bore a trace of indexical reference to real images: images of real people, real lives, real deaths.
AI-assisted targeting systems may have allowed the military to target and kill at an unprecedented scale, as international media outlets have reported. However, attention to developments on the ground shows that the violence was the result of deliberate decisions: a prime minister ordering a campaign of destruction and a military echelon eager to heed his demands.
AI has played a pivotal role in Israel’s war in the Gaza Strip since October 7, 2023, when Hamas militants massacred 1,200 Israeli civilians and soldiers and took more than 250 hostage (CNN 2025). Reporting by Israeli and international news outlets has detailed how large language models and predictive analytics are helping to determine when and where bombs fall from the sky and troops shoot on the ground (Abraham 2024). Developed by Israeli military units, yet bolstered by computing infrastructure and technologies provided by private civilian firms, these systems mark, according to US military officials and AI experts, the first time automated systems have been used in warfare at such a scale (Biesicker et al 2025).
No peaceful future can be built on the systematic production and elimination of targets as a primary tactic of warfare. If this ethos is not challenged more widely, we risk becoming the direct inheritors of Eichmann’s legacy. Indeed, if this ethos is instead celebrated and AI targeting finds its way into more conflicts – as seems to be the case in our darkening times – then, as Anders warned us, it is not only possible that “the monstrous” may be repeated. It may already be on the near horizon.