Jaz Hee-jeong Choi, Roopa Vasudevan
We met in Aarhus, Denmark, in the summer of 2019. Considering where we live physically—Roopa in Lenapehoking (Philadelphia) and Jaz in the Kulin Nation (Melbourne)—it was an odd place for us to meet. It was an odd time, too. In hindsight, it was like how we remember our last teenage summer, when everything was changing fast but the familiar patterns of our everyday lives kept us naive and carefree, or ignorant and cruel. Not long after we met, border and mobility control tightened across the world. We have connected twice since then from afar, via two pieces of writing: first, over Mimi Onuoha's "When Proof Is Not Enough," which pointedly showed how data as evidence can be useful in confirming what we perceive to be true but does little to shift perceptions, much less fix systemic injustices. And second, over James C. Scott's Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, which we both reencountered after several years. The third time, lucky for us, is this dialogue, cowritten as this column. We draw on Ted Chiang's words as we think with and through Roopa's art practice:
I don't think computers are even remotely close to human intelligence. We are decades away from that. You think about the number of operations that a computer can do per second, but that has nothing to do with human intelligence…. We are in the amoeba phase.
Jaz Hee-jeong Choi: It's a humorous image: so many humans being terrified by a big, bad amoeba. But, as Chiang notes in a different interview, the amoeba reference here is about technical advancement. While many portray AI as either heroic or tyrannical, what's actually terrifying are the "big, bad" capitalistic agendas feeding and shaping (amoebic) "algorithmic creatures" and their renderings, involving strategies of surveillance and control. Roopa, your work has explored similar themes. Could you share a little more about your practice?
Roopa Vasudevan: In much of my work, I have focused on creating space for the consideration of voice, agency, and embodiment within the digital. Technology allows for—maybe even demands—multiple renderings of the self, and I am interested in probing those to find discrepancies or dissonances. What is lost when humanity is translated back and forth between the physical and the digital, when identity is abstracted into quantifiable portions that can be packaged and marketed within what Shoshana Zuboff calls surveillance capitalism? What are the ineffable qualities of being human that get lost in this process? How does the surveillance that we are subjected to, day in and day out, skew the ways in which we are perceived—maybe even the ways we see ourselves?
With dataDouble (Figure 1; https://datadouble.art), I attempted to shed light on processes that are opaque and mysterious but have practical implications for how people experience the digital world. I created a Web extension that uses browsing data to alter and shape a user's photograph into a portrait of their "data double." To me, the most important part of this work was the opportunity it provided for people to view a representation of how dataveillance filters them through specific lenses, and to talk back to that—to have a chance to express what they feel is missing, or what scares them, or what doesn't bother them at all. Dataveillance is predicated on stripping humanity and complexity out of individuality and reducing it to monetary value, and doing so in a way that abstracts or hides its true effects. Putting a literal face on that—your literal face—brings those effects home a little and offers a way to return a human voice to an otherwise dehumanizing process.
Figure 1. Roopa Vasudevan, dataDouble (partial install view). Vox Populi, Philadelphia, PA. June/July 2021.
JHC: It's reminiscent of the scientific forestry Scott describes: a high-modernist narrowing of vision that made specific bits of the forest legible to serve the economic imperatives of specific people in power, and in doing so failed humans (in fact, more than humans). Only now it happens in bits and bytes, and at previously unimagined scales.
RV: Yes, exactly. I am also interested in examining what reads as legible within both digital and analog contexts, and how the power structures inherent in big tech warp what we perceive as acceptable or unacceptable. Two of my projects approach this from very different perspectives: Syntax Error (Figure 2; https://machinereadable.art/syntax-error/) from the perspective of programming error messages, and Big Tech Says Sorry (Figure 3; https://roopavasudevan.com/artwork/#big-tech-says-sorry) through affect. In both cases, technological systems—really, the people responsible for the most dominant and powerful of these—drive the ways in which we approach behaving, and also shape our acceptance of what is legitimate or "natural." The esoteric language of the error message may feel annoying or impenetrable, but we don't really question how it is written or think about who it might be written for (although there have been recent efforts to change that). At the same time, when sincere, personal, authentic apologies for an "error" (of varying magnitude) appear to come from large tech corporations, the language suddenly feels discomfiting and fake; a manufactured PR apology is expected, rather than what we might actually want: genuine accountability.
Figure 2. Roopa Vasudevan, Syntax Error (install view). Vox Populi, Philadelphia, PA. June/July 2021.
Figure 3. Roopa Vasudevan, Big Tech Says Sorry (partial install view). Vox Populi, Philadelphia, PA. June/July 2021.
I have lately found that integrating my personal experiences within these works has been an extremely generative way of pointing at these dissonances—in Syntax Error, I start from error messages that I constantly receive in my creative practice, and Big Tech Says Sorry emerged from an extended investigation into my own style of apologizing to those I feel I've wronged over the years. As in dataDouble, I find possibility in the juxtaposition of my own tangible presence—both as an artist and as a human being living under the watchful gaze of Google, Apple, and Amazon—with the more impersonal filter of technological systems and the companies that make and perpetuate them. To me, placing those things together allows us to surface what has faded into the woodwork—how we have learned to become more "transparent" to the machine (in Glissant's conception of the word) at the expense of the messiness and complexity that, in a way, defines being human.
JHC: How communicative modes may evolve beyond the audiovisual, and with differential intensity and dynamics, is a pertinent question today. I also remember that in response to the question about how technologies might change the world, Chiang points to the new language of online video—inclusive of its assumed sense of authenticity—afforded by networked technologies as pivotal to contemporary societal transformation; a different transformative language will similarly emerge through new technologies. He has also beautifully explored this theme in his novella Story of Your Life, which was adapted into the film Arrival. Interests in moving beyond the ocular—and how we might "give body," as you say, to different kinds of complex, abstruse embodiment—have been growing widely and wildly. I have been approaching it as "sense making," as continuous/contiguous to making sense. There are many examples and approaches, though, such as Anicka Yi's olfactory practice (her latest was In Love with the World at the Tate Modern) and Open Forest Collective's experimental inquiry into different forests and other-than-human dataflows, of which I have been a part. To me, crucial to doing these kinds of work is actively acknowledging the ways of knowing and being that have been neglected and displaced, which necessarily requires us to do our work with humility and reflexivity. "How" becomes very important. As Tyson Yunkaporta writes,
There is a pattern to the universe and everything in it, and there are knowledge systems and traditions that follow this pattern to maintain balance, to keep the temptations of narcissism in check. But recent traditions have emerged that break down creation systems like a virus, infecting complex patterns with artificial simplicity, exercising a civilizing control over what some see as chaos…. The war between good and evil is in reality an imposition of stupidity and simplicity over wisdom and complexity.
RV: Art has the power to approach these questions in experimental, creative ways that more traditional academic scholarship may not immediately embrace. But I think we also have to be careful about seeing it as existing on its own plane, divested from the realities of the systems we are all subject to. To do more than just gaze from below, as artists working with technology, we need to be aware that we are fundamentally embedded within the power structures and practices of the industry, and seriously consider this in our practices. As Denise Ferreira da Silva says, hacking a system is an extremely intentional act that first requires a thorough understanding of how things stand; that includes the ways we are positioned in relation to it. From where I am, artists have to take stock both of how we are surveilled and how we are complicit in the surveilling—which may make us uncomfortable, but ultimately is the only way we can imagine viable alternatives.
JHC: Agreed. It extends to other dimensions, too: the land we are on, the materials and resources we use, the processes in which we partake, and so much more, as we do our research and practice, see in different ways, and live with many different worlds.
RV: Returning to Scott, not "seeing the forest for the trees" and instead leaning into the relationality of existence and understanding the core connections we have to other objects, people, spaces—how all of these things inform what we consider to be default in our lives.
3. https://kosmopolis.cccb.org/en/edicions/k21/ted-chiang-carme-torras-i-toni-pou/ (video of the talk no longer publicly available)
4. Choi, J.H-j., Forlano, L., and Kera, D. Situated automation: Algorithmic creatures in participatory design. Proc. of the 16th Participatory Design Conference 2020 - Participation(s) Otherwise - Volume 2. ACM, New York, 2020, 5–9; https://doi.org/10.1145/3384772.3385153
9. Botero Cabrera, A., Dolejšová, M., Choi, J.H-j., and Ampatzidou, C. Open forest: Walking with forests, stories, data, and other creatures. Interactions 29, 1 (Jan.–Feb. 2022), 48–53; https://doi.org/10.1145/3501766
Jaz Hee-jeong Choi is director of the Carefull Design Lab and Vice-Chancellor's Principal Research Fellow in design at RMIT University in Melbourne, Australia.
Roopa Vasudevan is a media artist and scholar. Her work examines social and technological defaults; interrogates rules, conventions, and protocols that we often ignore or take for granted; and centers humanity and community in explorations of technology's impacts on society. She is currently a doctoral candidate at the Annenberg School for Communication at the University of Pennsylvania.
Copyright held by authors