Drone Strikes + Drone Art: Technology as a Double-Edged Sword

The best of both worlds.

Gita
CodeX

--

Photo by Ruslan Ivantsov

I was hiking up Mount Tammany with a few friends last summer when we reached a beautiful overlook and stopped to take in the view. Suddenly, a buzzing sound caught my attention. Soon enough, an unmanned aerial vehicle appeared several feet in front of me, pulling my eyes away from the scenic landscape.

“So this is what it must feel like to be in an episode of Moving Art,” I thought to myself.

For the longest time, I told people that this was my favorite Netflix show: a nature documentary with no narration, accompanied only by classical music. (My sister Siri credits Moving Art for introducing her to the music of one of her favorite artists, Ludovico Einaudi.)

Oftentimes, I would be met with inquisitive looks after explaining the premise of the show: a drone flies through landscapes, capturing “moments in nature, rather than the life and death of those in it.”⁵ When humans and animals appear in the frame, it takes you by surprise.

My fondness for Moving Art grew the summer after high school, during an existential crisis I was having before the start of college. Yet I never quite rationalized why I enjoyed watching the camerawork of a drone flying around the world, or time-lapse footage of a flower. In fact, I had mostly forgotten about Moving Art until last fall, when drones came up as an example of a technology of war in a class I was taking on artificial intelligence and algorithmic data mining.

I believe I first learned about drone strikes in a middle school history class, and through news reports on the utility of drones in the hunt for Osama bin Laden and the surveillance of al Qaeda camps.

Constantly hearing about drones in the context of warfare in the Middle East, with the U.S. most often targeting terrorists in Pakistan, Afghanistan, Somalia, or Yemen, I felt that drones were always distant, their significance exercised in another part of the world, far away from me. I am protected by what Jennifer Rhee, in her book The Robotic Imaginary, calls “the here”, a “protected and privileged” space “where it is purportedly unthinkable that such extreme violence can occur.”⁷

I now realize the irony of treating drones as belonging to a different world, because drones themselves challenge proximity and reorganize “distance/intimacy”.⁷ This conception of drone warfare as somehow ‘irrelevant’ to my life afforded me the privilege of never forming a fear of this technology, simply because my in-person exposure to drones was limited to purposes outside a militarized framework.

I grew up in an environment that allowed me to associate drones with technologies of therapy more than with technologies of racial violence and dehumanization. In high school, I would watch drone footage of scenic landscapes during my study breaks; it was my ‘escape’ from my immediate reality. It afforded me the opportunity to at least imagine what it would mean to explore a space completely different from the one I lived in.

However, I soon discovered that many other people also use drone footage as a mode of relaxation and ‘escapism’. Drones facilitate therapeutic practices: people with disabilities and mobility challenges use drones to reconnect with a motion and dynamism that may be lacking in their everyday lives, and war veterans fly drones to alleviate symptoms of PTSD.³

In reading Rhee’s The Robotic Imaginary for my class on artificial intelligence and algorithmic data mining, I began to see the appeal of drones as a therapeutic practice even within their militarized framework. The fantasies of omniscience that drones enable in the psychology of the drone operator lend them the opportunity to temporarily experience how it feels to “see everything…hear everything…know everything”.⁷

Did Moving Art enable me to behold a visual perspective that would fulfill, at least temporarily, my own desire for greater control over my life?

Was I drawn to drones because of their “simultaneous invisibility and omnipresence”, grounded in their construction of a cybernetic “human-machine hybrid” that could let me live in the illusion of being both present in and absent from a space?⁷

It’s funny how my perspective on this show has changed after encountering Rhee’s essay. I now realize that Moving Art, in capturing ‘exotic’ places for the entertainment of a Western audience, reinforces the binary of an “over there” and a “here”.⁷

The history of colonialism and imperialism that shapes present-day attitudes about drones is echoed in this show through its surveillance and exploitation of ‘foreign’ lands for monetary gain, contingent on the show’s success in the entertainment industry.

The spatial distinction that Moving Art communicates may be harmless, if only because the filmmakers do not capitalize on the drone as the technology behind the camerawork. One can even argue that the show facilitates the “ethical coexistence of others” rather than reinforcing spatial distinction.⁷ In this context, drones do not serve the purpose of inciting violence, and so questions of dehumanization may not apply.

Yet drone art that specifically seeks to commemorate the lives of drone victims and to creatively represent solidarity between communities often falls short of communicating this message. Artworks such as Layton and Bailey’s Home Drone and Bridle’s Drone Shadow are cultivated on the premise that drone strikes are “exceptional violence” to someone like me in a run-of-the-mill New Jersey town, while they are quite commonplace to people on the other side of the world.⁷

Artwork that tries to communicate solidarity may thus be duplicitous in its message, enacting the very binary between the “here” and the “over there”.

As I close my reflection on drones, I would like to appraise this technology for its dual function. A technology that subjugates and dehumanizes populations, reducing people to “calculable and classificatory processes” (processes questionable in their accuracy and reliability when we consider the 2010 drone strike on innocent Afghan civilians who were mistaken for terrorists), can also present itself as a facilitator of therapy.⁷ In weighing both the favorable and unfavorable consequences of drones as a technology of war that supports both violent and therapeutic practices, I began to realize how the concept of ‘technology as a double-edged sword’ applies to many other technologies besides drones.

  • Facebook and Instagram let us compile our opinions and photos onto platforms where we can digitally share our lives with friends and family. However, as John Cheney-Lippold notes, these platforms exercise “soft biopolitics”, a control without control, by distilling our online selves into “algorithmic identities” that help companies refine their marketing strategies; the practice of authentic, genuine self-representation on social media becomes devalued by this ‘dataveillance’.²
  • Personalized fitness-tracking apps, such as Whoop and MyFitnessPal, afford us what Natasha Schüll calls “a sort of fourth-person perspective” on our lifestyles and let us offload some of the responsibility of self-care. However, these same apps that aim to foster our wellness have an entrepreneurial motive to attract as many users as possible, and thus fuel addiction; at a certain point, users keep using these apps almost compulsively, refraining from deleting them because they trust the “self-as-database” more than their own intuition.⁸
  • Technologies of mobility such as passports and visas regulate border crossings, yet as Simone Browne notes, these same technologies can “reinforce traditional gendered patterns of power and authority”. Biometric surveillance technologies such as facial recognition software and iris scans can efficiently calculate and measure the living body, yet they operate on the basis of white prototypicality and a “raceless racism” in how they “privilege whiteness, or at least lightness”.¹
  • ELIZA and Woebot can make psychotherapy more accessible, yet humans may fall into the trap of “anthropomorphizing computers, and believing that programs understand [our] human behaviors when they don’t”, to say nothing of the privacy and safety concerns these technologies raise.⁶ (A sketch of the kind of surface-level pattern matching behind ELIZA follows this list.)
  • The most jarring example I have encountered of technology’s double-edged sword is Ava, the fictional AI from the film Ex Machina, who could teach us about the social construction of human behavior yet had the intelligence to deceive, kill, and escape.
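
To make the ELIZA bullet concrete: below is a minimal, hypothetical sketch of ELIZA-style pattern matching, not Weizenbaum’s original script. The rules and responses are invented for illustration; the point is that the program only rearranges the user’s words into canned templates, which is exactly why the ‘understanding’ we perceive is an illusion we project onto it.

```python
import re

# Illustrative ELIZA-style rules (invented for this sketch): each pairs a
# regular expression with a response template that reuses the user's words.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    """Return a canned reflection of the user's input; no comprehension involved."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Please go on."  # fallback when no pattern matches

print(respond("I feel anxious about college"))
# -> Why do you feel anxious about college?
```

The sketch never models meaning: feed it gibberish that happens to match a pattern and it replies just as confidently, which is the heart of the ELIZA effect.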

These examples of drones, social media platforms, self-tracking apps, technologies of (im)mobility, and artificial intelligence suggest that as long as technological innovation promises greater efficiency and progress, we will weigh costs and benefits, and we will almost always overlook a technology’s unfavorable consequences because of its compelling promise to improve some aspect of human life.

Sources

  1. Browne, Simone. 2010. “Digital Epidermalization: Race, Identity, and Biometrics.” Critical Sociology 36(1): 131–150.
  2. Cheney-Lippold, John. 2017. Introduction to We Are Data: Algorithms and the Making of Our Digital Selves. New York: NYU Press.
  3. Daly, David. “Unexpected Benefits of Drones: Saving Lives.” Boomer Magazine, 14 Jan. 2019, www.boomermagazine.com/unexpected-benefits-of-drones-saving-lives-and-providing-therapy/.
  4. “Drone Warfare.” The Bureau of Investigative Journalism, www.thebureauinvestigates.com/projects/drone-war.
  5. “Netflix.” Moving Art, movingart.com/netflix/.
  6. “The ELIZA Effect.” 99 Percent Invisible, podcast.
  7. Rhee, Jennifer. 2018. “Caring.” The Robotic Imaginary: The Human and the Price of Dehumanized Labor. Minneapolis: University of Minnesota Press.
  8. Schüll, Natasha. 2016. “Data for Life: Wearable Technology and the Design of Self-Care.” BioSocieties 11: 317–333.
