Medusa FPS


Medusa FPS is a variant of the First Person Shooter (FPS) genre: gun combat experienced through the eyes of the gun-holder. In this game, the gun is an AI-aided robotic weapon that determines when to shoot, fires automatically at enemies within its field of view, and guides the player’s hand to aim more effectively. The player cannot drop the weapon or stop it from firing, but she can obstruct her (and the gun’s) vision. The object of the game is to shoot as few people as possible.
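The core mechanic can be sketched in code. This is a hypothetical illustration of the rule described above, not the game’s actual implementation; every name and parameter here is an assumption:

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: the gun fires at any enemy inside its viewing cone,
# and the player's only recourse is to obstruct that view.
# None of these names come from the actual game.

@dataclass
class Enemy:
    x: float
    y: float

def in_field_of_view(gun_x, gun_y, facing_deg, fov_deg, enemy):
    """True if the enemy falls inside the gun's viewing cone."""
    angle = math.degrees(math.atan2(enemy.y - gun_y, enemy.x - gun_x))
    # Signed angular offset from the facing direction, wrapped to [-180, 180).
    diff = (angle - facing_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2

def auto_fire_targets(gun_x, gun_y, facing_deg, fov_deg, enemies, vision_obstructed):
    """The gun targets every visible enemy -- unless the player blocks the view."""
    if vision_obstructed:
        return []
    return [e for e in enemies
            if in_field_of_view(gun_x, gun_y, facing_deg, fov_deg, e)]

# Example: gun at the origin facing east (0 degrees) with a 90-degree cone.
enemies = [Enemy(10, 1), Enemy(-5, 0), Enemy(3, 3)]
print(len(auto_fire_targets(0, 0, 0, 90, enemies, False)))  # -> 2 (the two enemies ahead)
print(len(auto_fire_targets(0, 0, 0, 90, enemies, True)))   # -> 0 (covered lens sees nothing)
```

The point of the sketch is the asymmetry it encodes: the firing decision lives entirely in the weapon’s code path, while the player is given only the `vision_obstructed` flag.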


By now we are all familiar with the threat of perpetual visibility created by various tracking technologies, and with the corresponding threat of being targeted. Most of us are targets of ad campaigns rather than military raids, but sophisticated smart weapon systems rely on information often supplied by similar, if not the same, surveillance systems. Behind much computer vision and face tracking research is the desire to pin our physical presence to our digital trail. And of course this kind of seeing is not just passive or disinterested; it is possessive. Being seen is being targeted. In military raids this takes on a more sinister connotation. Mark Dorrian writes that the name of the Gorgon Stare drone-based surveillance system (and its Medusa head logo) points to ‘the real desire which is the collapse of the acts of seeing and killing into one another, the conferral of death in the moment of visualization.’

Medusa was a monster in Greek mythology, a Gorgon known for having snakes for hair and for the ability to turn onlookers to stone with her gaze. She was beheaded by Perseus, who then used her head, which retained its power to petrify, as a weapon.


The algorithmic ‘smartness’ of the machine, an issue that popularized the philosophical trolley problem as applied to self-driving cars, implies an even more troublesome distribution of agency and ethical accountability when we consider the autonomy of a weapon system. ‘In the case of robotic weapon,’ Dorrian writes, ‘the sense of where agency resides tends to remain distributed across a complex concatenation of human and non-human actors. It is perhaps unsurprising that this should be the case with offensive weapons, for such dispersal inevitably minimizes the possibility of identifying a single, fully accountable agent and seems to drain away culpability itself. As drone pilot Matt Martin commented, following the mistaken killing of two boys on a bicycle: “The responsibility for the shot could be spread among a number of people in the chain — pilot, sensor, JTAC, ground commander. That meant no single one of us could be held to blame.”’

Meanwhile, personal weapons, the ones we all have the right to carry and purport to have full agency in using, are becoming more sophisticated and reliant on smart technology as well. The TrackingPoint rifle is a ‘smart rifle’ made by a startup from Texas. It uses rangefinders, environmental sensors, a ‘guided trigger,’ and a host of other ‘core technologies’ to make it a ‘precision guided firearm’ that aids the shooter. (Not to be confused with the other ‘smart gun’ technology introduced in the US gun debate: guns that might be made ‘smart’ by recognizing their rightful owner.) As NPR reports, it is “so effective that some in the shooting community say it should not be sold to the public. On a firing range just outside Austin in the city of Liberty Hill, a novice shooter holds one and takes aim at a target 500 yards away. Normally it takes years of practice to hit something at that distance. But this shooter nails it on the first try.”

Here’s a rather terrifying ad for the rifle:

One feature of the rifle is its sophisticated color graphics display. Every shot can be recorded, replayed and posted on social media. “They like to post videos; they like to be in constant communication with groups or networks,” said company President Jason Schauble about the younger generation, the intended target audience for their product. “This kind of technology, in addition to making shooting more fun for them, also allows shooting to be something that they can share with others.”

Seeing is done in a way that reveals the subject in those aspects used for targeting. The FPS reality seen through a gun viewfinder is a reality in which everything is a potential target; the killing action is implicated in the apparatus of vision and in the rendering of the world itself. The apparatus of vision filters the meaning of reality via the images it produces. Ken Johnson writes about Harun Farocki’s exhibit ‘Images of War (at a Distance)’: ‘It’s a meditation on the degree to which our world, what we take for reality, is formed by recording and image-making machinery (…) Representational technology becomes an experience in and of itself, which at least partly eclipses what it purports to reveal. We live in a world of scary, reality-determining technologies. Mr. Farocki asks an increasingly urgent question: do we control our machines or do they control us? If technology is in charge, what does it want?’ It might seem a rhetorical question, but it is an excellent formulation of the issue of agency in technology.

It is tempting to liken Medusa’s head, severed from its body, to technology that performs the mechanics of its architecture but may have severed the strings to the life systems it was developed to support. Medusa’s head remains operational, and it can be easily weaponized, while her body is turned into unthinking, senseless, lifeless flesh by the separation. The beheading renders concrete the mind-body separation performed by Kant and handed down to us, with many thinkers along the way reinforcing the anthropocentric dualisms of mind and body, nature and culture. Rarely do we consider humans in relation to our surroundings as biologically embodied minds, and our understanding of technology further reinforces these conventional distinctions: technology is the encounter between the human and the physical system, but it springs from the mind; it is the extension of our intellect. What does the technology want? What principles guide it? As Sigurd Bergmann writes in Atmospheres of Synergy, our natural surroundings, and our body as the space we live in, “affect greatly our practices and discourses concerning justice.” Without a body to give it morally differentiated perception, what kind of justice can smart technology possess?

Medusa’s head, separated from her body and turned into a technology, can serve as an analogy for thinking about modern biotechnology. “Engineering of life forms has been marked by one discerning feature — the disaggregation of organisms into its constituent parts such that each part becomes a potentially ownable component. This replaces the earlier ‘teleological conception of an organism,’ where practically all vital processes are considered to be so organized that they are directed to the maintenance, production, or restoration of the wholeness of the organism,” writes Rajshree Chandra. In other words, detached from its evolutionary history, the part acquires a “de-historicized purposiveness of its own.” This operation of purifying parts into new technological devices that can be owned in turn draws attention to science and technology projects as exercises in the production of power.

Medusa’s myth offers a further analogy for the workings of smart systems of control. It is in fact the sight of Medusa that turns people into stone, not a cone of vision projecting, laser-like, from her eye as in the Gorgon Stare logo. The stare has to be met with a stare back, has to be acknowledged. It is the fear that Medusa’s image strikes in the beholder that paralyses him or her; it is the apprehension of her monstrosity. In order to use fear to strike, the threat has to be produced: the source of the future fear has to be revealed to the beholder. Once generated, threat can be a supremely powerful weapon. Its power is fueled by its nonexistence in the present. “Threat is from the future,” writes Brian Massumi, “It is what might come next. It is not, in a way that’s never over. Threat is not real in spite of its nonexistence. It is superlatively real, because of it. Threat has an impending reality in the present. This actual reality is affective.” Massumi’s example of the superlative reality of threat is George W. Bush’s justification of the invasion of Iraq, despite the lack of evidence of the alleged weapons of mass destruction. In 2004 Bush admitted this knowledge had been available, but argued that the invasion was the right thing to do anyway: America is safer because it removed an enemy who had the capacity to produce such weapons. Massumi paraphrases: “The invasion was right, because in the past there was a future threat.”

The Medusa FPS project hinges on the conflict created between the player and her in-game character. Virtual environments have allowed us to create and play out multiple personas, and thus potentially to stage an internal dialogue between them. The first-person point of view conventional to the genre helps set up such a confrontation here. Vision is shared between the player and the character, and it is a site of contention over each one’s agency. The player uses what the game mechanics make available to her to act against her own in-game character.

*Medusa FPS is supported by the Rhizome Commission.