Columns

XXIX.2 March–April 2022

Seeing like a state (of surveillance)


Authors:
Jaz Choi, Roopa Vasudevan

We met in Aarhus, Denmark, in the summer of 2019. Considering where we live physically—Roopa in Lenapehoking (Philadelphia) and Jaz in the Kulin Nation (Melbourne)—it was an odd place for us to meet. It was an odd time, too. In hindsight, it was like how we remember our last teenage summer, when everything was changing fast but the familiar patterns of our everyday lives kept us naive and carefree, or ignorant and cruel. Not long after we met, border and mobility control tightened across the world. We have connected twice since then from afar, via two pieces of writing: first, over Mimi Onuoha's "When Proof Is Not Enough" [1], which pointedly showed how data as evidence can be useful in confirming what we perceive to be true but does little to shift perceptions, much less fix systemic injustices. And second, over James C. Scott's Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed [2], which we both reencountered after several years. The third time, lucky for us, is this dialogue, as we cowrite this column. We draw on Ted Chiang's words as we think with and through Roopa's art practice:

I don't think computers are even remotely close to human intelligence. We are decades too far in the future. You think about the number of operations that a computer can do per second, but that has nothing to do with human intelligence…. We are in the amoeba phase [3].

Jaz Hee-jeong Choi: It's a humorous image: so many humans being terrified by a big, bad amoeba. But, as Chiang notes in a different interview, the amoeba reference here is about technical advancement. While many portray AI as either heroic or tyrannical, what's actually terrifying are the "big, bad" capitalistic agendas feeding and shaping (amoebic) "algorithmic creatures" [4] and their renderings, involving strategies of surveillance and control. Roopa, your work has explored similar themes. Could you share a little more about your practice?

Roopa Vasudevan: In much of my work, I have focused on creating space for the consideration of voice, agency, and embodiment within the digital. Technology allows for—maybe even demands—multiple renderings of the self, and I am interested in probing those to find discrepancies or dissonances. What is lost when humanity is translated back and forth between the physical and the digital, when identity is abstracted into quantifiable portions that can be packaged and marketed within what Shoshana Zuboff calls surveillance capitalism [5]? What are the ineffable qualities of being human that get lost in this process? How does the surveillance that we are subjected to, day in and day out, skew the ways in which we are perceived—maybe even the ways we see ourselves?

With dataDouble (Figure 1; https://datadouble.art), I attempted to shed light on processes that are opaque and mysterious but have practical implications for how people experience the digital world. I created a Web extension that uses browsing data to alter and shape a user's photograph into a portrait of their "data double." To me, the most important part of this work was the opportunity it provided for people to view a representation of how dataveillance filters them through specific lenses, and to talk back to that—to have a chance to express what they feel is missing, or what scares them, or what doesn't bother them at all. Dataveillance is predicated on stripping humanity and complexity out of individuality and reducing it to monetary value, and doing so in a way that abstracts or hides its true effects. Putting a literal face on that—your literal face—brings those effects home a little and offers a way to return a human voice to an otherwise dehumanizing process.

Figure 1. Roopa Vasudevan, dataDouble (partial install view): 15 stylized, manipulated portraits hung in two rows on a gray gallery wall, flanked by a small shelf holding two dark blue books and a framed print of a black, green, and orange QR code. Vox Populi, Philadelphia, PA. June/July 2021.

JHC: It's reminiscent of scientific forestry, which Scott describes to illustrate how a high-modernist narrowing of vision made specific bits of the forest legible to serve the economic imperatives of specific people in power, and in doing so failed humans (in fact, more than humans)—only now in bits and bytes, and at previously unimagined scales.


RV: Yes, exactly. I am also interested in examining what reads as legible within both digital and analog contexts, and how the power structures inherent in big tech warp what we perceive as acceptable or unacceptable. Two of my projects approach this from very different perspectives: Syntax Error (Figure 2; https://machinereadable.art/syntax-error/) from the perspective of programming error messages, and Big Tech Says Sorry (Figure 3; https://roopavasudevan.com/artwork/#big-tech-says-sorry) through affect. In both cases, technological systems—really, the people responsible for the most dominant and powerful of these—drive the ways in which we behave, and also shape our acceptance of what is legitimate or "natural." The esoteric language of the error message may feel annoying or impenetrable, but we don't really question how it is written or think about who it might be written for (although there have been recent efforts [6] to change that). At the same time, when sincere, personal, authentic apologies for an "error" (of varying magnitude) appear to come from large tech corporations, the language suddenly feels discomfiting and fake; a manufactured PR apology is expected, rather than what we might actually want: genuine accountability.

Figure 2. Roopa Vasudevan, Syntax Error (install view): two wall-mounted monitors, one showing dark gray text on a neon green background and the other neon green text on a dark gray background. Vox Populi, Philadelphia, PA. June/July 2021.
Figure 3. Roopa Vasudevan, Big Tech Says Sorry (partial install view): four framed Google Street View screenshots on a gray gallery wall, flanked by posters styled as though issued by Uber, Amazon, Google, and Facebook. Vox Populi, Philadelphia, PA. June/July 2021.

I have lately found that integrating my personal experiences within these works has been an extremely generative way of pointing at these dissonances—in Syntax Error, I start from error messages that I constantly receive in my creative practice, and Big Tech Says Sorry emerged from an extended investigation into my own style of apologizing to those I feel I've wronged over the years. As in dataDouble, I find possibility in juxtaposing my own tangible presence—both as an artist and as a human being living under the watchful gaze of Google, Apple, and Amazon—with the more impersonal filter of technological systems and the companies that make and perpetuate them. To me, placing those things together allows us to surface what has faded into the woodwork—how we have learned to become more "transparent" to the machine (in Glissant's [7] conception of the word) at the expense of the messiness and complexity that, in a way, defines being human.

JHC: How communicative modes may evolve beyond the audiovisual, and with differential intensity and dynamics, is a pertinent question today. I also remember that, in response to a question about how technologies might change the world, Chiang points to the new language of online video—inclusive of its assumed sense of authenticity—afforded by networked technologies as pivotal to contemporary societal transformation; a different transformative language will similarly emerge through new technologies. He has also beautifully explored this theme in his novella Story of Your Life, which was adapted into the film Arrival. Interest in moving beyond the ocular—and in how we might "give body," as you say, to different kinds of complex, abstruse embodiment—has been growing widely and wildly. I have been approaching it as "sense making," as continuous/contiguous with making sense. There are many examples and approaches, such as Anicka Yi's olfactory practice (her latest was In Love with the World at Tate Modern [8]) and Open Forest Collective's experimental inquiry into different forests and other-than-human dataflows [9], of which I have been a part. To me, what is crucial to doing this kind of work is actively acknowledging the ways of knowing and being that have been neglected and displaced, which necessarily requires us to do our work with humility and reflexivity. "How" becomes very important. As Tyson Yunkaporta writes,

There is a pattern to the universe and everything in it, and there are knowledge systems and traditions that follow this pattern to maintain balance, to keep the temptations of narcissism in check. But recent traditions have emerged that break down creation systems like a virus, infecting complex patterns with artificial simplicity, exercising a civilizing control over what some see as chaos…. The war between good and evil is in reality an imposition of stupidity and simplicity over wisdom and complexity [10].

RV: Art has the power to approach these questions in experimental, creative ways that more traditional academic scholarship may not immediately embrace. But I think we also have to be careful about seeing it as existing on its own plane, divorced from the realities of the systems we are all subject to. To do more than just "gaze from below" as artists working with technology, we need to be aware that we are fundamentally embedded within the power structures and practices of the industry, and to consider this seriously in our work. As Denise Ferreira da Silva [11] says, hacking a system is an extremely intentional act that first requires a thorough understanding of how things stand; that includes the ways we are positioned in relation to it. From where I am, artists have to take stock both of how we are surveilled and of how we are complicit in the surveilling—which may make us uncomfortable, but ultimately is the only way we can imagine viable alternatives.

JHC: Agreed. It extends to other dimensions, too: the land we are on, the materials and resources we use, the processes in which we partake, and so much more, all of which shape how we do our research and practice, see in different ways, and live with many different worlds.

RV: Returning to Scott: not "seeing the forest for the trees," but instead leaning into the relationality of existence and understanding the core connections we have to other objects, people, and spaces—how all of these things inform what we consider to be the default in our lives.

References

1. Onuoha, M. When Proof Is Not Enough. FiveThirtyEight; https://fivethirtyeight.com/features/when-proof-is-not-enough/

2. Scott, J.C. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. Yale Univ. Press, 1998.

3. https://kosmopolis.cccb.org/en/edicions/k21/ted-chiang-carme-torras-i-toni-pou/ (video of the talk no longer publicly available)

4. Choi, J.H-j., Forlano, L., and Kera, D. Situated automation: Algorithmic creatures in participatory design. Proc. of the 16th Participatory Design Conference 2020 - Participation(s) Otherwise - Volume 2. ACM, New York, 2020, 5–9; https://doi.org/10.1145/3384772.3385153

5. Zuboff, S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.

6. https://medium.com/processing-foundation/2017-marks-the-processing-foundations-sixth-year-participating-in-google-summer-of-code-d365f62fc463

7. Glissant, É. Poetics of Relation. B. Wing, trans. Univ. of Michigan Press, 1997.

8. https://www.tate.org.uk/whats-on/tate-modern/exhibition/hyundai-commission-anicka-yi

9. Botero Cabrera, A., Dolejšová, M., Choi, J.H-j., and Ampatzidou, C. Open forest: Walking with forests, stories, data, and other creatures. Interactions 29, 1 (Jan.–Feb. 2022), 48–53; https://doi.org/10.1145/3501766

10. Yunkaporta, T. Sand Talk: How Indigenous Thinking Can Save the World. Text Publishing, 2019.

11. Ferreira da Silva, D. Hacking the subject: Black feminism and refusal beyond the limits of critique. philoSOPHIA: A Journal of Continental Feminism 8, 1 (2018), 19–41.

Authors

Jaz Hee-jeong Choi is director of the Carefull Design Lab and Vice-Chancellor's Principal Research Fellow in design at RMIT University in Melbourne, Australia. [email protected]

Roopa Vasudevan is a media artist and scholar. Her work examines social and technological defaults; interrogates rules, conventions, and protocols that we often ignore or take for granted; and centers humanity and community in explorations of technology's impacts on society. She is currently a doctoral candidate at the Annenberg School for Communication at the University of Pennsylvania. [email protected]

Copyright held by authors
