
ARTICLE    
A Drone Manifesto

  

Katherine F. Chandler
Georgetown University
Kfc9@georgetown.edu



Division and asymmetry are hallmarks of drone aircraft. The purported legality of targeted assassinations carried out through unmanned systems depends on the assumption that the onscreen enemy poses an imminent threat, delineating “us” from “them.” As surveillance networks, used to see over the hill or from tens of thousands of meters above, drones separate watcher and watched, reworking the existential quandary between the self and the world for military purposes. Drones are imagined as technologies and their teleology determined by factors that exceed what is human, even as the aircraft are accounted for as mere tools (see Department of Defense, 2013; Obama, 2013). They are named “unmanned,” a variation on the gender binary between men and women, even as they disavow the human. Conceived as such, drones play into oppositional logics that remain central to Western thinking, lethally layered over the question of who is human and what is other.

Attempts to counter drone aircraft similarly mirror dualisms that frame their use. Critics count targets as civilians, whose lives “we” must protect. For some opposed to drones, the images onscreen are akin to video games that become the operator’s framework for engaging with the world, while the machinelike system is the ghostly counterpart of once-courageous soldiers (see Benjamin, 2013; Chamayou, 2014; Singer, 2010). Paradigmatic of the failures of drone aircraft are targeted strikes attacking wedding parties, which also reinscribe a separation between the personal and domestic, opposed to the geopolitics of war. Efforts to counteract drones continue distinctions between human and machine, combatant and civilian, men and women, and domestic and international.

Drones, however, insistently fail to fit into the frameworks that overlay them, as discussed by theorists of technoscience and feminism (see Asaro, 2013; Blanchard, 2011; Suchman & Weber, 2016). Networked operations between human and technology underlie the operation of unmanned aircraft. Decisions in these networks rely on multiple actors that extend beyond the individual. The transnational cooperation implied by drone missions challenges territorial boundaries and reworks sovereignty, just as the strikes transform the limits of war and who is a target. Looking closely at drones reveals their connection to a myriad of changing parts: sensor operators, image analysts, legal counsel, ground forces (see Gregory, 2014). More than an aircraft, the drone might be better thought of as an information system, reliant on satellite, video, radio, and data exchange. Drones are not unique in the ways they network together parts that defy coherent selves, strict boundaries, or territorial limits, though their poignancy may stem from the ways they make apparent the political and ethical challenges produced by these crossings and the limitations of current frameworks for addressing such confusions.

Given that drones are irreducible to human and machine, “us” and “them,” “here” and “there,” why do these divisions persist in discussions of and responses to unmanned aircraft? Why dissociate the connected parts? I turn to Donna Haraway’s 1985 [1991] essay “A Cyborg Manifesto” to explore how its call to reform binary worldviews might be applied to unmanning. Indeed, today’s drones might be cyborgs, a point that underscores the text’s cautionary reminder that the synthesis between human and machine it celebrates is first and foremost a product of the Cold War military-industrial complex. Yet cyborgs and drones remain bastards, never acknowledged for their mixtures. Haraway asserts:

In the traditions of “Western” science and politics—the traditions of racist, male-dominated capitalism; the tradition of progress . . . the tradition of reproduction of the self from the reflections of the other—the relation between organism and machine has been a border war. (p. 150)

The cyborg myth she proposes instead calls for “transgressed boundaries, potent fusions, and dangerous possibilities” (p. 154), articulating a politics that works against the ideals of identity, gender binaries, claims to human singularity, and political divisions between public and private. That drones and the warfare waged with them might characterize the “dangerous possibilities” called for by “A Cyborg Manifesto” causes me to read the two in tandem, even if this may be a perversion of the political potentials found in the manifesto. Transgressed boundaries, however, cannot be disentangled from the military-industrial complex and the state, although it is salient that neither recognizes drones or cyborgs for their violations. Even public responses to drones often rely on these same dualisms, if only in reverse, underscoring their dehumanizing effects.

Rather than reproducing boundaries between human and other to counter drone aircraft, I instead articulate more thoroughly the connections and confusions unmanned systems create. They make manifest how humans and technologies are coproduced in ways that transform, extend, and refashion limits. By reading drones as cyborgs and situating the history of the systems within the discourse of cybernetics, I explore how drones are not dualistic but instead dissociate the connected parts they link. I argue that an effective challenge to the problems raised by unmanned aircraft must explore the contradictory logics that make targeted killing possible, as well as the disconnections they produce. This analysis has two parts. The first examines the jet-powered drone aircraft known as the Firebee, developed during the Cold War and used as a training target for aerial combat. I show how, by borrowing from the cybernetic discourse of the period, drone aircraft are presented as automata even though humans remain necessary for their functioning. This disjuncture persists as unmanned aircraft develop in the following decades and contemporary drones proliferate. Turning to two contemporary responses to unmanned aircraft—George Brant’s play Grounded (2013) and an action by the feminist antiwar organization CODEPINK, “Drone Strike on White House Wedding” (2014b)—the second part of the paper explores how the separation between human and other also frames reactions against drones, as well as the gendered corollaries layered into this binary. Finally, I consider how the human-machine synthesis Haraway proposes might offer another approach to the critique of targeted killing.


Drone disassociations: Cybernetic monads and so-called unmanned aircraft

Experimental efforts to build drones have paralleled the development of flight in the twentieth century (Armitage, 1988; Chandler, in press; Ehrhard, 2010; Newcome, 2004). Notwithstanding the term’s ubiquity today, the name “drone” within the military referred to target planes that trained antiaircraft gunners, an ongoing use dating to the 1930s. Whereas early military designations classified guided missiles, target drones, and pilotless aircraft in the same category, designations provided by the US Congress during efforts to oversee unmanned aircraft between 1987 and 1988 established differences between unmanned platforms and missiles (see Mosier, 1988; Parsch, 2016). Today’s unmanned aircraft differ significantly from earlier systems, incorporating digital computing, composite materials, and satellite communications into the platforms, yet drones from the Cold War provide a figure for human and machine relations that presages those of today.

Most cybernetic research from the science’s mid-century height focused on the analogy between humans, animals, and machines, not the fusion between them (see Bowker, 1993; Galison, 1994; Kline, 2009). Early efforts to promote drone aircraft in the Cold War borrowed from these analogies, using figures of machine and animal in an attempt to emphasize the autonomy of the unmanned plane while minimizing the role of the engineers, technicians, and operators necessary for its functioning. Conceived as machine replacements for piloted planes, drones are presented as if they operate in response to their environment, like automata. Below, I examine this history to consider how drones dissociate human and machine—both integral to their operation—and the ways that “black box” controls hide these relations.

A key example of Cold War drones is the Firebee target, which is used to this day by all branches of the US military for training surface-to-air, sea-to-air, and air-to-air defenses. The unmanned targets were initially designed in 1948 and built by Ryan Aeronautical in San Diego, California. “The Bee with an Electronic Brain,” an article published by the manufacturer on 15 March 1953, introduces the drone to the American public in the company’s magazine. As the article describes it, “The spectacular Ryan ‘Firebee,’ from which the curtain of secrecy was recently lifted by the Department of Defense, is America’s newest turbo-jet, pilotless target drone, capable of near sonic speeds at high altitudes” (1953, p. 12). The jet-powered drone was produced as military aircraft engineered during World War II became obsolete. As a training target, the unmanned aircraft was used to simulate air attack by high-speed jets built to fly faster than the speed of sound. Unlike drones from the World War II era, which were either based on model airplanes or were modified piloted aircraft, Ryan target drones were designed and engineered to be operated without a pilot. “The Bee with an Electronic Brain” builds on the likeness between unmanned aircraft and the bee established by military researchers working on drones during World War II, but adds an “electronic brain,” further analogizing machine and organism through the drone’s programmed responses.

Although the Firebee was engineered as if it would fly on its own, the article explains that “by use of the ground remote control station, the ‘nolo’ (no live operator) aircraft can be flown out-of-sight at high altitudes, while other men on the ground track it by electronic devices” (p. 12). Significantly, when the author introduces humans, they are described as being on “the ground,” and their relation to the drone is mediated through information. Control of the Firebee is explained in the following passage: “Responding to ghostlike controls that may be miles away, Ryan Firebee flashes across the sky, ready to simulate fighter plane tactics in sharpening anti-aircraft defenses” (pp. 12–13). A passive controller—presumably a human being—is figured as a phantasm, motivating the action of the Firebee flashing across the sky. The description registers the operator as at once distant and disembodied. The Firebee apparently acts on its own even though the specter of human involvement remains. As the article later explains, a black box, not the human operator, organizes the electronic transmissions sent to the drone, “governing” the mechanical functions of the Firebee.

The use of the terms “black box” and “electronic brain,” as well as the naturalized description of a technological system, all draw from cybernetics. “Behavior, Purpose and Teleology” (Rosenblueth et al., 1943) served as a foundational text for the theory, and one of that paper’s authors, Norbert Wiener, later proposed the term cybernetics, drawing on the Greek word for “steersman” to name the new science. Wiener positioned cybernetics as a multidisciplinary approach to the study of control and communication. Aligning organisms and technologies, cybernetics studies how systems of inputs and outputs act in response to their environment. Historian of science Peter Galison argues that cybernetics articulates humans, animals, and machines as “a universe of black box monads” (1994, p. 265). I examine how the monadic unit ties to the Firebee’s flight, showing how it is described as if it were governed by inputs and outputs, and indicating how these relations elide human and machine. This analysis adds to previous accounts of the “black box” in cybernetics, showing how the Firebee’s control system produces a confusion over who or what responds to external conditions. Drone action was explained as a machine’s response to its environment, even though the ghostly human operator was integral to its operation.

Another press image and caption (Figure 1), also produced by Ryan Aeronautical in 1953, linked the black-box control of the Firebee to the system’s onboard electronic brain.

Like a released parasite, the Ryan Q-2 pilotless drone target plane is launched from its B-26 “mother” plane and streaks out over the desert under its own power during U.S. Air Force development tests at Holloman Air Development Center, Alamogordo, N.M. Speed and maneuverability of the “Firebee” are controlled from the ground by means of a black box remote control which transmits command signals to its electronic “brain.” (Ryan Aeronautical, 1953)

The caption presents the drone as at once a bee, a parasite, and a baby before explaining how the “electronic brain” and “black box” operate. The Firebee was either catapulted from the ground or released from pylons on a converted cargo plane (as shown in Figure 1) and landed by parachute. It might have been characterized as a parasite or a baby because the drone cannot perform two of the functions most basic to piloted aircraft—takeoff and landing. After presenting the system as a parasitical technology, though, the next part of the caption ties the Firebee to the desert below and seems to presume its separation from its “mother.” In the caption’s passive construction, the “black box,” not a human operator, transmits command signals to the drone’s “electronic brain,” suggesting the system’s apparent autonomy. Here, the reader is invited to think of the drone as behaving in response to inputs and outputs, transmitted from the ethereal landscape. The dependency implied in the first sentence of the caption is cycled into the cybernetic operation that occurs in the second sentence. Responding to the inputs the Firebee parasitically uses, the aircraft streaks across the desert, as if it were controlled by an electronic brain.

A black-and-white front and back scan of a photograph: the front shows the drone and a small plane mid-flight above hills; the back carries the press caption quoted above.

Figure 1. A/BQM-34 Technical Files, Smithsonian National Air and Space Museum, NASM-9A13518, NASM-9A13518-A.

In “Behavior, Purpose and Teleology” (Rosenblueth et al., 1943), the authors observe that “a torpedo with a heat-seeking mechanism” might be “intrinsically purposeful” (p. 19), as its response is guided by reaction to heat. Cybernetics describes action relationally, between object and environment. The focus is on the singular relation between the organism or machine and its environment, rather than their fusion. In the description of the Firebee, the black box transferred information between operators and the drone. The drone’s movement displaced this interaction, which made the system seem self-propelled, separate from the controller on the ground guiding its flight.


“Black-boxing” has been widely discussed in science and technology studies. Donald MacKenzie (1993) provided one commonly used framework for the concept. He defined “black box” through a quote by Charles Draper, founder and director of the Massachusetts Institute of Technology’s Instrumentation Laboratory. Draper explained that the black box was an ideal arrangement of a self-contained unit that would not be affected by external conditions. For MacKenzie, the more specific meaning proposed by Draper ties to a broader definition of the black box: “It is a technical artifact—or, more loosely, any process or program—that is regarded as just performing its function without any need for, or perhaps any possibility of, awareness of its internal workings on the part of users” (1993, p. 26). MacKenzie comes to this definition by showing how the guidance system developed in Draper’s laboratory troubled the idea of an apparently self-contained system, arguing that the guidance technologies were inextricable from their social, scientific, military, and political context. Extrapolating from this analysis, MacKenzie writes that “the more deeply one looks inside the black box, the more one realizes that ‘the technical’ is no clear-cut and simple world of facts isolated from politics” (p. 381).

I want to add to an analysis of the obfuscation that occurs through “black-boxing” by considering the cybernetic system of inputs and outputs that ostensibly controlled the Firebee. “Behavior, Purpose and Teleology” explains organism and mechanism as monadic units responding to their environments through inputs and outputs. In this case, the black box does not stand for just self-containment; it is also a system for organizing information. A dictionary definition of the black box explains that it is a “device which performs intricate functions but whose internal mechanism may not readily be inspected or understood; [hence] any component of a system specified only in terms of the relationship between inputs and outputs” (OED Online, 2016). What is important about these black boxes is that they propose not only the self-containment that Draper emphasizes but also a singular relation to external conditions.

The black box controls on the Firebee isolate the technical from the political by displacing human control onto the action of the organism-like machine and vice versa. The Firebee acts as if it responded to its environment, even though examination of the inputs and outputs shows that this feedback loop conflates human and machine to produce these reactions. Returning to “The Bee with an Electronic Brain,” the article explains that the “push button heart of the Firebee project is a small ‘black box’ containing a control stick and switches to govern engine speed and other flight conditions, and to transmit control signals to the drone” (1953, p. 13). Here, the black box organizes the inputs and outputs between human and machine to set up the behavior of an apparently singular unit. Described as “the heart” of the Firebee, the control unit linked the output of the drone, “any change produced in the surroundings by the object,” to the input, “any event external to the object that modifies this behavior in any way” (Rosenblueth et al., 1943, p. 17), relayed by radio transmission from the operator. The black box created a cybernetic system, even as the control unit, which linked human and machine, undid its “monadic” structure. The pre-programmed “electronic brain” added another layer to the Firebee’s operation, automatically stabilizing the response of the aircraft to the input of the controller, as determined by calculations made beforehand. Programs and signals that layered together human and technical control made the aircraft’s flight possible, even as the Firebee apparently “streaked out” across the sky on its own.

The controls of the Firebee organized human and nonhuman behaviors to create a cybernetic system, even as the “black box” functioned to separate human control from technological action. The elision of human engineering, design, and control with a behavioristic model of technology provided the conditions for the concept of “unmanned” to emerge. The name shows how remotely operated aircraft were distinguished from piloted flight by acting as if no “man” were necessary, though both rely on interactions between humans and machines. The drone is a cyborg, yet the connection between operator and aircraft is obscured, understood instead as inputs and outputs filtered through a black box. Haraway (1991) asks the reader to imagine and realize human-machine fusions in ways that escape categorical understandings of these terms. In her essay, the cyborg is slippery and ironic, a bionic man and a woman of color. Responses to “A Cyborg Manifesto” have articulated the critical limitations of these fusions, notably in Galison’s (1994) analysis of cybernetics and its wartime origins. Yet these reactions reproduce divisions between human and machine, rather than working through the stakes of their confusion. In the development of early drone aircraft, interconnections between human and machine are obscured through the monadic rationale of cybernetics. While organism and machine were analogized, connections between humans and machines were literally black-boxed and the role of the operator became ghostlike and spectral.


Drone syntheses: Beyond the “domestic” critique

In this section, I take up how the dissociation between human and machine has continued in contemporary unmanned systems, extending my analysis to gender and to how these categorizations map onto characterizations of drones in domestic and international politics. The contribution of Haraway’s cyborg myth is not to mitigate the lethal potential of drone aircraft, but to insist that these challenges can only be addressed by considering the networked connections they both produce and are formed by. “A Cyborg Manifesto,” by turning away from origins and wholes, calls for a politics that does not resolve but rather is parsed out through the complexity of socio-technical relations. Above, I indicate how “black-box” controls hide the operation of unmanned systems, enabling drones to act as though they were autonomous. The cyborg approach I advocate, on the other hand, considers how humans and machines act in concert and underscores the contradictions that come about when they are separated.

Contemporary headlines, like the early press releases for target drones, portray the aircraft as if unmanned systems act on their own. Consider a recent article, “Hacker Killed by Drone Was Islamic State’s ‘Secret Weapon’” (Coker et al., 2015). The headline describes how a “drone” targeted and killed suspected Islamic State operative Junaid Hussain, naming him not as a person but as a weapon. Here the entire drone strike is figured as a black box, described as a machine-like response. Of course there are human operators involved in the drone’s mission, but their role and that of intelligence personnel, likely from the United States and Britain, remain obscure, as do connections to image analysts, legal counsel, and commanders, and the regulations that underwrite such a strike. According to the article, “the drone” apparently responds to the target, rather than to a network of people and information, which includes participants from the United States, Britain, and likely beyond. Such accounts continue the cybernetic framework of a singular feedback loop set up between “drone” and “target,” erasing the involvement of human operators and their role in killing Junaid Hussain.

A similar logic is at play in efforts to counter the use of drone aircraft, which also divide between human and machine, here and there, “us” and others. Below, I examine two responses to the military use of unmanned aircraft: Brant’s play Grounded (2013) and CODEPINK’s “Drone Strike on White House Wedding” (2014b). These pieces both raise important questions about the use of drones, reflecting on post-traumatic stress disorder (PTSD) experienced by operators and the death of civilians in missile strikes carried out by unmanned systems (see also Brandt, 2013). The counternarratives they propose have received widespread attention and challenge the purported success the US government claims for the drone program. These attempts to counter drone aircraft nonetheless reproduce the dissociations explored in the previous section. By emphasizing how “drones” negatively affect “humans,” the accounts underscore divisions between human and machine rather than addressing the networked connections that are the basis of unmanning. Further, these challenges to drones and the global asymmetries they make manifest rely on images of women, children, and the nuclear family that conform to ideals of Western, white subjects and the “others” they envision. In this way they mirror, rather than undermine, the logics that make drone war possible.

Brant’s play, Grounded, first performed in 2013, is a one-woman show about a fighter pilot turned drone operator. The work challenges gendered assumptions about both professions, portraying a female character who by turns is tough, driven, and loyal. After becoming pregnant, she leaves her career as a fighter pilot and joins the “chair force,” flying an MQ-9 Reaper unmanned aircraft from Creech Air Force Base in Nevada. Over the course of the play, the protagonist becomes increasingly troubled by her new position. She is caught in the monotony of grey-toned surveillance, watching for the enemy for months on end and tapped out by other pilots who participate in the continuous shift work that characterizes drone missions. She develops PTSD, which affects her relationship with her husband and daughter before she is jailed, at the end of the play, for refusing to carry out a strike against an enemy target when she sees a child in the field of attack.

The conditions enacted in the play are symptomatic of the experiences of contemporary drone pilots and the dissociative relation between the battlefield and home that they must negotiate. As P. W. Singer (2010) notes:

[Drone] operations have created the novel situation of pilots experiencing the psychological disconnect of being “at war” while still dealing with the pressures of home. In the words of one Predator pilot . . . “You are going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in your car, drive home, and within 20 minutes you are sitting at the dinner table talking to your kids about their homework.” (p. 32)

Significantly, though, these concerns take a separation between international conflict and domestic life as given. It is their proximity that raises challenges, suggesting the presumption that war, at least for American military personnel, is not “here.”

Obviously, this disassociation stands in stark contrast with the experiences of people under wartime occupation, who are exposed to war on a daily basis regardless of their status as military personnel or civilians. Why is it presumed that war can be waged in such a way that it will not transform the fabric of one’s life? I raise this question not to minimize the significance of challenges for military personnel who move between war and home on a daily basis, like drone pilots. Rather, I suggest that to understand this situation as a problem brought about by drone technology is to miss the particular ideas of wartime distance that it assumes. This can only be seen as a consequence of drone technology if one takes as given divisions between international and domestic spheres and the work and politics with which each is associated.

In Grounded, blue and grey code the experiences of flying a fighter plane and a drone aircraft, a contrast that illustrates how the play reproduces asymmetries between wartime actions “abroad” and what happens “here.” Speaking to Tiger, the fighter plane, at the beginning of the play, the protagonist says:

You are alone in the vastness and you are the blue / Astronauts / They have eternity / But I have color / I have blue / I’m in the blue for a reason / I have missiles to launch / I have Sidewinders / I have Mavericks / I rain them down on the minarets and concrete below me. (2013)

As a fighter pilot stationed overseas—“in the blue”—she feels no remorse for her wartime actions. Compare her affect to her account of the Reaper she flies. No longer speaking to the plane, she has become the system: “Back to the grey / It’s funny / The screen isn’t that big / But it becomes your world / Like the TV I guess / Or the computer / But the grey is / It definitely” (2013). This contrast between blue and grey marks the final moments of the play. Caught in the grey images, she mistakes the child she sees onscreen for her daughter. Watching as another pilot carries out the strike that she has refused to undertake, she says:

The team cheers as my daughter dies / As her arms and legs fly off in separate directions / As her pulp is mixed with the car and the Prophet and / the sand / As her pulp dissolves into the grey / There is only grey now / Only the grey. (2013)

What strikes me about this contrast between blue and grey, here and there, fighter plane and drone is how easy it is for the protagonist to fire missiles against “minarets and concrete.” Only seeing the child killed onscreen as her own child precipitates her crisis. Why does “blue” not seem equally chilling and problematic?

Performed as an extended monologue, the play centers on the sole female performer and her trauma, which registers in her domestic life. The only people named in Grounded are the pilot’s husband and daughter. The politics that brought about her involvement in the unnamed war, as well as her association with the military hierarchy, are spectral, as is “the Prophet,” the target she watches onscreen. The play relies on a solitary woman to stand in for a network of transnational relations imbricated by drone warfare, a negative corollary to media accounts that black-box what happens in military strikes by ascribing the action to the “drone.” Grounded reduces the protagonist’s reaction to the political and ethical challenges she experiences in a war to an individual response marked as grey, in which she sees the death of another child as that of her own. The world enacted by the play remains persistently “here” and offers no exploration of the global political relations that are central to the system she operates.

This oversight extends to activist actions that have challenged drone warfare. On Sunday, 4 May 2014, CODEPINK carried out a performance in front of the White House that became a short YouTube video, “Drone Strike on White House Wedding” (2014b). The website explains that the concept came from a Hellfire missile strike, launched from a drone aircraft against a procession of vehicles celebrating a wedding in Yemen on 13 December 2013, that killed twelve civilians (2014a). In the reenactment, a group of participants gather for a staged wedding. A large, two-sided sign features the slogan “Here Comes the Bride” on one side and “Here Comes the Drone” on the other (2014b). For the performance, the blonde bride wears a simple white dress and a crown of pink flowers; her groom is dressed in a suit with a pink bow tie. They exchange vows and rings, kissing each other before reacting to a simulated attack from the sky. The participants in the action fall to the ground, prominent among them the bride and groom, who are covered in white sheets with fake bloodstains.

According to a CODEPINK press release, the performance aimed “to educate the public about how terrifying it would be to have the same thing happen in the US, and motivate people to take action against the drones” (2014a). The concern about the strike is significant, and the action builds on a report by Human Rights Watch documenting the deaths (2014). The purported legality of these targeted killings should be questioned, and I support ongoing efforts to hold the United States government responsible for civilian deaths caused by attacks from unmanned aircraft. However, it is striking that in raising awareness about drone warfare, the performance replaced the bodies of the targeted Yemenis with white Americans and modeled the strikes as terror in the midst of marital bliss. Why should this typify the challenges of drone warfare for people in the United States? And how is it possible that this scene is conceived as the “same thing” as what happened in Yemen?

The CODEPINK action, like the final sequence of Grounded, uses a domestic scene premised on relations “here” to question drone strikes. What this risks doing is erasing the significant asymmetries and the substantially different challenges that confront people being attacked through drone aircraft, in this case in Yemen. Understanding what happened on 13 December 2013 might call for a closer analysis of the wedding that was attacked, presumably in a way that would emphasize Yemeni traditions, not those of the United States. Such an examination would also address how the drone strikes may have played into what has now become a civil war in the country, drawing on historical separations between the northern and southern regions, as well as the competing regional influences of Iran, Saudi Arabia, and the Islamic State.

Conceiving the drone as cyborg would show how the networked system challenges the territorial limits of countries. Unofficially, the American military has indicated that the strikes were carried out through intelligence provided by the Yemeni government (Reuters, 2015). Given the myriad interests in the region, this cooperation may have extended beyond these two nations. These complications do not absolve the United States of responsibility for the death of civilians. However, they do highlight how critical responses to drone aircraft should address the ways that human and machine networks linked through unmanned aircraft undo straightforward divisions. A US attack on Yemeni territory, in this case, seems indicative of the two countries’ cooperation (at least at that time), not their enmity. Correspondingly, critical responses to the targeted killings must also work between these categories. A cyborg response to the deaths of Yemeni civilians targeted during a wedding might ask how this tragedy fits into what has become a widespread humanitarian emergency, affecting up to 80 percent of Yemen’s population (see BBC, 2015). Seeing the attack as a strike against a US wedding does not help to understand this context, which more than “the drone” is cause for concern.


“I am not not a drone”

The closing sentence of “A Cyborg Manifesto” proclaims, “I would rather be a cyborg than a goddess” (Haraway, 1991, p. 181)—a refrain that charted a new course for feminisms, machines and others, emphasizing their entanglement. Reading this work in the face of human and machine fusions enacted by drone aircraft, however, I want to take seriously that this assertion for the cyborg carries with it the challenges and contradictions of the military-industrial complex, which the manifesto also outlines. Haraway offers short, dense commentary on the socio-technologies she thinks will come to characterize the cyborg state:

decentralization with increased surveillance and control; citizenship by telematics; imperialism and political power broadly in the form of information rich / information poor differentiation; increased high-tech militarization increasingly opposed by many social groups . . . close integration of privatization and militarization, the high-tech forms of bourgeois capitalist personal and public life; invisibility of different social groups to each other, linked to psychological mechanisms of belief in abstract enemies. (1991, p. 171)

I take this depiction as providing alternative avenues to critique how increasingly intricate human and machine systems are used to wage war, drones being a case in point. “A Cyborg Manifesto” insists that it is not possible to divide “us” from unmanned aircraft, instead calling for a more thorough engagement with drone networks. This means tracing out how drones deploy distinctions between “information rich” and “information poor” to justify the asymmetrical structure between who targets and who is targeted, just as it also suggests a reconsideration of critiques that emphasize “domestic” impacts of drones or innocent civilian bodies, turning instead to the permeability of boundaries between the “personal body and the body politic” (p. 170).

The drone as cyborg undoes the newness attributed to a system that was the subject of press releases during the height of cybernetics and in the early period of the Cold War. The cyborg reminds us that the problem is not drone aircraft per se, but the ways drone systems tie into ongoing practices of patriarchal capitalism, the legacy of colonialism, and techno-determinism. Insisting on the syntheses that are the basis of drones shows how popular accounts dissociate human and machine, war and home, friend and enemy, men and women, even as the networked operations of so-called unmanned aircraft undo these categories. Examining how drones cause deaths, challenge boundaries, and rework practices of governance asks for a more intricate interrogation of the “potent fusions” they incorporate. The negation proposed by “unmanning,” which separates “us” from the drone, might instead be replaced by the space of the double negative—“I am not not a drone”—and the responsibility for the human and machine synthesis that would be taken up through such a statement.



Acknowledgements

I am grateful for feedback and commentary on this article from the Worldly Matters Faculty Seminar, supported through a Global Engagement Grant from Georgetown University, as well as comments from other participants in this issue, especially Marisa Brandt, and the anonymous reviewers. Any errors remain my own.


References

Asaro, P.M. (2013). The labor of surveillance and bureaucratized killing: New subjectivities of military drone operators. Social Semiotics, 23(2), 196-224.

Armitage, M. (1988). Unmanned aircraft. London, UK: Brassey’s Defence Publishers.

BBC. (2015, 15 December). Yemen crisis: How bad is the humanitarian situation? BBC News. Retrieved from http://www.bbc.com/news/world-middle-east-34011187/

Benjamin, M. (2013). Drone warfare: Killing by remote control. London, UK: Verso.

Blanchard, E. (2011). The technoscience question in feminist international relations: Unmanning the US war on terror. In J.A. Tickner & L. Sjoberg (Eds.), Feminism and international relations: Conversations about the past, present, and future (pp. 146-163). New York, NY: Routledge.

Bowker, G. (1993). How to be universal: Some cybernetic strategies, 1943-70. Social Studies of Science, 23(1), 107-127.

Brandt, M. (2013). Cyborg agency and individual trauma: What Ender’s Game teaches us about killing in the age of drone warfare. A Journal of Media and Culture, 16(6). Retrieved from http://journal.media-culture.org.au/index.php/mcjournal/article/view/718

Brant, G. (2013). Grounded. Croydon, UK: Oberon Books, Kindle Edition.

Chamayou, G. (2014). A theory of the drone. (J. Lloyd, Trans.). New York, NY: The New Press.

Chandler, K. (in press). American kamikaze: Television guided assault drones in World War II. In C. Kaplan & L. Parks (Eds.), Life in the age of drones. Durham, NC: Duke University Press.

CODEPINK. (2014a). Activists stage dramatic simulation of drone strike on Yemen wedding in front of White House. CODEPINK: Women for Peace. Retrieved from http://www.codepinkarchive.org/article.php?id=6717

CODEPINK. (2014b). CODEPINK: Drone strike on White House wedding. Retrieved from https://www.youtube.com/watch?v=hizvcnL2xQE

Coker, M. et al. (2015, 27 August). Hacker killed by drone was Islamic State’s ‘secret weapon.’ Wall Street Journal. Retrieved from http://www.wsj.com/articles/hacker-killed-by-drone-was-secret-weapon-1440718560

Department of Defense. (2013). Unmanned systems integrated roadmap FY 2013–2038. Retrieved from http://archive.defense.gov/pubs/DOD-USRM-2013.pdf

Ehrhard, T. (2010). Air Force UAVs: The secret history. Mitchell Institute Study. Retrieved from http://robotpig.net/robotics-news/air-force-uavs-the-secret-history-_1895

Galison, P. (1994). The ontology of the enemy: Norbert Wiener and the cybernetic vision. Critical Inquiry, 21(1), 228-266.

Gregory, D. (2014). Drone geographies. Radical Philosophy, (183), 7-19.

Haraway, D. (1991). A cyborg manifesto: Science, technology and socialist-feminism in the late twentieth century. In D. Haraway, Simians, cyborgs, and women: The reinvention of nature (pp. 149-181). New York, NY: Routledge.

Human Rights Watch. (2014, 19 February). US: Yemen drone strike may violate Obama policy. Human Rights Watch. Retrieved from https://www.hrw.org/news/2014/02/19/us-yemen-drone-strike-may-violate-obama-policy

Kline, R. (2009). Where are the cyborgs in cybernetics? Social Studies of Science, 39(3), 331-362.

MacKenzie, D. (1993). Inventing accuracy: A historical sociology of nuclear missile guidance. Cambridge, MA: The MIT Press.

Mosier, R. (1988). DOD joint unmanned aerial vehicle program master plan–1988. Washington, DC: Tactical Intelligence Systems Directorate. Retrieved from http://www.uadrones.net/military/research/acrobat/880627.pdf


Newcome, L. (2004). Unmanned aviation: A brief history of unmanned aerial vehicles. Reston, VA: American Institute of Aeronautics and Astronautics, Inc.

Obama, B. (2013, 23 May). The future of our fight against terrorism. National Defense University, Washington, DC. Retrieved from https://www.lawfareblog.com/text-presidents-speech-afternoon

Parsch, A. (2016). Current designations of US unmanned military aerospace vehicles. Directory of US military rockets and missiles. Retrieved from http://www.designationsystems.net/usmilav/missiles.html#_System

Reuters. (2015, 29 January). Exclusive: US armed drone program in Yemen facing intelligence gaps. Reuters, US Edition. Retrieved from http://www.reuters.com/article/us-yemen-security-usa-exclusive-idUSKBN0L22UL20150129

Rosenblueth, A. et al. (1943). Behavior, purpose and teleology. Philosophy of Science, 10(1), 18-24.

Ryan Aeronautical. (1953, 26 February). Photograph and Caption, A/BQM-34 Technical Files, National Air and Space Museum Archives, Washington, DC.

Ryan Reporter. (1953, 12 March). The bee with an electronic brain. A/BQM-34 Technical Files, National Air and Space Museum Archives, Washington, DC.

Singer, P.W. (2010). Wired for war: The robotics revolution and conflict in the twenty-first century. New York, NY: Penguin Press.

Suchman, L., & Weber, J. (2016). Human-machine autonomies. In N. Bhuta, S. Beck, R. Geis, H.-Y. Liu, & C. Kreis (Eds.), Autonomous weapons systems (pp. 75-102). Cambridge, UK: Cambridge University Press.



Bio

Katherine F. Chandler is Assistant Professor of Culture and Politics in the Edmund A. Walsh School of Foreign Service at Georgetown University. Her research is at the intersection of social and political theory, science and technology studies, and new media. She is a recipient of the Townsend Center for the Humanities Fellowship and the Peter Lyman Fellowship for New Media. Her current book project is titled “The Techno-Politics of Unmanning: How Humans, Machines and Media Assemble Drone Flight and Failure.”




DOI: http://dx.doi.org/10.28968/cftt.v2i1.85.g165



Copyright (c) 2017 Katherine Fehr Chandler
