
Entanglements


Author: Christopher Frauenberger
Posted: Mon, April 27, 2020 - 5:27:42

As people around the world try to make sense of a new normal, many commentators are saying that this global pandemic will change us permanently. Indeed, the shifts in the ways that we run our societies are seismic and could hardly have been imagined only a few months ago—social distancing, travel restrictions, and the shutdown of large parts of our economy, all implemented within a few weeks. Even if this may, and hopefully will, be only temporary, the experience of living through coronavirus will stay with us. We will have seen what is possible, both because it was necessary and because we chose to do so. Aspects of our existence that seemed to be set in stone will have turned out to be up for debate. The virus will have changed who we are.

Recently, I made an argument for a similar form of entanglement with the nonhuman world, that of technology. In [1], I argued that our intimate relationship with digital technology has become equally existential, in that the digital things we create fundamentally change who we are. Consequently, the key for guiding our creation of technological futures should be the political question of who we want to be as part of the world we happen to share with other things and beings—a holistic ethico-onto-epistemological perspective [2] that treats questions of being (ontology), knowledge creation (epistemology), and responsibility and purpose in the world (ethics) as inseparable from each other. While man-made technology is of course different from the virus in important ways, I find this relational, posthuman perspective also to be a very effective lens for making sense of our response to the pandemic, as it plays out in the context of technology as well as more generally.

As we find ourselves in a messy situation that is hard to assess, we debate what the right or wrong response to this pandemic might be. In many ways, it is like a crash course on ethical dilemmas: Whose lives do we save—literally as well as in the sense of livelihoods—and at what cost? Who gets left behind? Who exerts that power and by what authority? And what will happen afterward? The main political arenas in this debate include public health, the economy, and technology, and we are currently reconfiguring these arenas by redefining some central relations between human and nonhuman actors [3]. This mattering [2], these material-discursive practices, distributes agency and power in new ways. It makes certain things possible while making other things very difficult, and it enacts new lines of difference and othering. While there are certainly many different ways in which this plays out, I want to pick out two examples that struck me as particularly relevant for the field of HCI.

As in many other countries, schools in Austria were officially closed in mid-March. The teachers of my two children scrambled to find ways to implement “eLearning” in a matter of days. They sent PDF worksheets as large attachments to emails with 30-odd recipients, answered questions in WhatsApp groups, distributed links to online content from all over the Internet, and organized the occasional video conference, asking “So, how is everybody?” Like all of us, they were caught by surprise and find themselves on steep learning curves. Some parents are distraught—they share one laptop among three children and need to do their home office work at night, when none of the children is eLearning. Others do not have access to a printer, or are running low on toner or paper. Some children have taken on the roles of translators and IT consultants for their parents while self-organizing their own education—knowledge and skills that they will not be credited for. Communication with some children has dropped out entirely.

There has been a lot of work on the (new) digital divide [4,5], but the virus has laid it bare in the midst of our society. Further, in our response to the pandemic we witness firsthand a reconfiguration of that divide, an implicit (and explicit) othering, facilitated through technology. Hard-to-reach children from difficult socioeconomic backgrounds, whom we should have the greatest interest in lifting up, have just been dealt a(nother) bad hand. In the coming surge of efforts to design new roles for technology in education, which will no doubt happen, we will need to negotiate these entanglements between improving learning experiences and creating equal opportunities. There will be no best decisions, only choices and trade-offs in the political arena that is innovation. And, I argue, one of the most productive questions that can guide this innovation is: Who do we want to become through the (educational) tools we bring into this world?

The second example is, at its core, a struggle as old as humanity: between the common good and individual freedom. And, as one would expect, it implicates digital technology at its center. Surveillance capitalism has co-evolved with technology to produce an infrastructure that runs on unprecedented levels of knowledge about the masses, with mechanisms for behavior prediction and manipulation at scale [6]. Now this infrastructure can potentially inform our response to the pandemic. In Austria, the formerly state-owned telecommunications provider produced aggregated data about people’s mobility for the government after its first lockdown measures. While the public largely supported the lockdown, this use of information was perceived as suspicious. Google, of course, provides a similar analysis for all the countries of the world and uses movement data to chart the least busy times to shop in supermarkets. Helpful now, no doubt, but also a simple repurposing of information that the company collected for different reasons. It is interesting to note that, at least in Austria, the public seems to be wary of the state using such information, while private companies seem to seize the opportunity to whitewash their practices.
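To make concrete what “aggregated” means here, the following is a minimal sketch, in Python, of the kind of reduction such mobility reports plausibly perform: individual location pings are collapsed into coarse per-district, per-hour counts, and small counts are suppressed so no individual stands out. This is an illustration under my own assumptions, not the provider’s or Google’s actual pipeline; the field names and the threshold are hypothetical.

    from collections import Counter

    SUPPRESSION_THRESHOLD = 10  # hypothetical: drop cells describing too few people

    def aggregate_mobility(pings):
        """pings: iterable of (user_id, district, hour) location records."""
        # Count each user at most once per (district, hour) cell.
        unique_visits = {(user, district, hour) for user, district, hour in pings}
        counts = Counter((district, hour) for _, district, hour in unique_visits)
        # Suppress small cells: the aggregate should reveal crowds, not individuals.
        return {cell: n for cell, n in counts.items() if n >= SUPPRESSION_THRESHOLD}

    # Example: raw pings in, coarse "how busy is this district at this hour" out.
    pings = [("u1", "Neubau", 17), ("u2", "Neubau", 17), ("u1", "Neubau", 17)]
    print(aggregate_mobility(pings))  # {} -- only two distinct people, suppressed

Even this toy version shows where the politics sit: someone chooses the granularity and the threshold, and those choices determine what the data can and cannot reveal.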

A related function is contact tracing. In Austria, the Red Cross teamed up with Accenture and a private insurer to produce an app, Stopp Corona, that uses Bluetooth and ultrasound to estimate the distance between two mobile phones, performing a digital handshake if they are close enough. IDs are exchanged, and if someone tests positive, the logs allow the tracing of contacts to contain the spread. The Austrian data-rights NGO epicenter.works analyzed the app and came to the conclusion that it does many things right—it was developed with privacy and security in mind. But no independent audits have yet been conducted, and questions about the involvement of private companies are being asked. There are also fears that, while officially denied, use of the app will become quasi-compulsory for participating in the slow reopening of public life. Meanwhile, a broad European alliance of research institutes and technology providers has teamed up in the Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) project. And, in an unusual alliance, Google and Apple are collaborating to build contact tracing into their mobile operating systems in similar ways.
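For readers curious about the mechanics, here is a minimal sketch, in Python, of the decentralized handshake-and-trace pattern that such apps broadly follow. It is an illustration only, not the actual Stopp Corona, PEPP-PT, or Google/Apple protocol; the class, the proximity threshold, and the ID scheme are hypothetical simplifications.

    import secrets
    import time

    PROXIMITY_THRESHOLD_M = 2.0  # hypothetical "handshake" distance in meters

    class Phone:
        def __init__(self):
            # Random, rotating ephemeral IDs: they carry no personal information.
            self.broadcast_ids = [secrets.token_hex(16)]
            self.contact_log = []  # (seen_id, timestamp) pairs, stored locally

        @property
        def current_id(self):
            return self.broadcast_ids[-1]

        def rotate_id(self):
            """Periodically replace the broadcast ID so phones cannot be tracked."""
            self.broadcast_ids.append(secrets.token_hex(16))

        def handshake(self, other, distance_m):
            """Digital handshake: exchange ephemeral IDs if two phones are close."""
            if distance_m <= PROXIMITY_THRESHOLD_M:
                now = time.time()
                self.contact_log.append((other.current_id, now))
                other.contact_log.append((self.current_id, now))

        def ids_to_publish(self):
            """On a positive test, the user uploads only their own random IDs."""
            return set(self.broadcast_ids)

        def check_exposure(self, published_ids):
            """Each phone checks its own log against the published infected IDs."""
            return [ts for seen_id, ts in self.contact_log if seen_id in published_ids]

    # Usage: Alice and Bob meet; later, Alice tests positive.
    alice, bob = Phone(), Phone()
    alice.handshake(bob, distance_m=1.5)          # close enough: IDs are exchanged
    infected_ids = alice.ids_to_publish()         # sent to a notification server
    print(len(bob.check_exposure(infected_ids)))  # 1 risky contact found

The design choice worth noticing is that matching happens on the phone, against locally stored logs; only random IDs of confirmed cases ever leave the device. Centralized variants invert this, and that inversion is precisely what the public debate is about.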

Alongside medical testing, such data may become indispensable for making informed decisions about the far-reaching changes in our societies. The nature of the sociotechnical infrastructure that we build to produce this data determines what we can know, whom it will discriminate against, and what we become through it—again, an ethico-onto-epistemological question [2]. This entanglement is being acknowledged, as we witness a new quality of debate that recognizes that technology is deeply political. While the paradigm of surveillance capitalism has rampaged through our societies largely unchecked, with far-reaching consequences for our democratic structures, questions about whose interests are served by tracing apps and how they reconfigure power are starting to be asked. As in the educational context above, there will be no objectively “correct” design decisions in using big data to keep pandemics at bay. And, as with the notorious trolley problem, there is no correct answer to how much privacy we should give up to save how many lives. These are choices that, I argue, need to be negotiated. We also need to find appropriate formats for people to participate in this process of agonistic struggle for desirable (technological) futures [7]. And we should be guided by the question of what the technology we bring into this world will make us, and whether this is who we want to become.

It may well be that public health becomes the next national security—an inherently elusive yet indisputable desire of people that is misused to justify technological surveillance. As with 9/11, we might see the coronavirus serve as a pretext for implementing modes of mass behavior manipulation by private companies. However, current public discourse offers glimpses of hope that society might have come to realize something in this pandemic: that digital technology is not just a tool; that innovation is a political arena in which we can participate; that technology creators are political actors who cannot be allowed to stand above democratic accountability; and that we can have a voice in shaping technological futures—as they shape who we become through them. In a very posthuman, relational way, the virus may have shifted our relationship with technology.

Endnotes

1. Frauenberger, C. Entanglement HCI: The next wave? ACM Trans. Comput.-Hum. Interact. 27, 1 (2019), 2:1–2:27. DOI: 10.1145/3364998

2. Barad, K. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press, Durham, NC, 2007.

3. Latour, B. Reassembling the Social: An Introduction to Actor-Network-Theory. Clarendon Lectures in Management Studies. Oxford University Press, Oxford, UK, 2005.

4. Warschauer, M. Technology and Social Inclusion: Rethinking the Digital Divide. MIT Press, Cambridge, MA, 2004.

5. Brandtzæg, P.B., Heim, J., and Karahasanović, A. Understanding the new digital divide—A typology of Internet users in Europe. International Journal of Human-Computer Studies 69, 3 (2011), 123–138. DOI: 10.1016/j.ijhcs.2010.11.004

6. Zuboff, S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, New York, 2019.

7. Mouffe, C. Agonistics: Thinking the World Politically. Verso, 2013.



Posted in: Covid-19

Christopher Frauenberger

Christopher Frauenberger is a senior researcher at the Human-Computer Interaction Group, TU Wien (Vienna University of Technology). His research focuses on designing technology with and for marginalized user groups, such as those with disabilities. He is committed to participatory design approaches and builds on theories and methods from diverse fields such as action research, disability studies, philosophy of science, and research ethics. [email protected]