Just futures

XXVII.4 July - August 2020
Page: 74


Christopher Frauenberger


Many are saying that this global pandemic will change us permanently. Indeed, the shifts in the ways that we run our societies are seismic and could hardly be imagined only a few months ago: social distancing, travel restrictions, and the shutdown of large parts of our economy, all implemented within a few weeks. Even if this may, and hopefully will, be only temporary, the experience of living through coronavirus will stay with us. We will have seen what is possible, both because it was necessary and because we chose to take certain actions. Aspects of our existence that seemed to be set in stone have turned out to be up for debate. The virus will have changed who we are.

Recently, I made an argument for a similar form of entanglement with the nonhuman world, that of technology. In [1] I argued that our intimate relationship with digital technology has become equally existential, in that the digital things we create fundamentally change who we are. Consequently, the key for guiding our creation of technological futures should be the political question of who we want to be as part of the world we happen to share with other things and beings—a holistic ethico-onto-epistemological perspective [2] that treats questions of being (ontology), knowledge creation (epistemology), and responsibility and purpose in the world (ethics) as inseparable from each other. While man-made technology is of course different from the virus in important ways, I find this relational, posthuman perspective also to be a very effective lens for making sense of our response to the pandemic, as it plays out in the context of technology as well as more generally.

As we find ourselves in a messy situation that is hard to assess, we debate what is the right or wrong response. It's not unlike a crash course on ethical dilemmas: Whose lives do we save—literally and in the sense of livelihoods—and at what cost? Who gets left behind? Who exerts that power and by what authority? And what will happen afterward? The main political arenas in this debate include public health, the economy, and technology, and we currently reconfigure these arenas by redefining some central relations between humans and nonhuman actors [3]. This mattering [2], these discursive material practices, distributes agency and power in new ways: it makes certain things possible while making others very difficult, and it enacts new lines of difference and othering. While there are certainly many different ways in which this mattering plays out, I want to pick out two examples that struck me as particularly relevant for the field of HCI.

As in many other countries, schools were officially closed in Austria in mid-March. The teachers of my two children scrambled to find ways to implement "eLearning" in a matter of days. They sent PDF worksheets as large attachments to emails with 30-odd recipients, answered questions in WhatsApp groups, distributed links to online content all over the Internet, and organized the occasional video conference, asking, "So, how is everybody?" Like all of us, they were caught by surprise and found themselves on a steep learning curve. Some parents are distraught. They share one laptop among three children and must do their own remote work at night, when none of the children is eLearning. Others do not have access to a printer or are running low on toner or paper. Some children have taken on the roles of translators and IT consultants for their parents, while self-organizing their own education—knowledge and skills for which they will not be credited. Communication with some children has dropped off completely.

Who do we want to become through the tools we bring into this world?

There has been a lot of work on the (new) digital divide [4,5], but the virus has laid it bare in the middle of our society. Further, in our response to the pandemic we witness firsthand a reconfiguration of that divide, an implicit (and explicit) othering, facilitated through technology. Hard-to-reach children from difficult socioeconomic backgrounds, whom we should have the highest interest in lifting up, have just been dealt a(nother) bad hand. In the coming surge of efforts to design new roles for technology in education, as no doubt will happen, we will need to negotiate these entanglements between improving learning experiences and creating equal opportunities. There will be no best decisions, only choices and tradeoffs in the political arena that is innovation. And, I argue, one of the most productive questions that can guide this innovation is: Who do we want to become through the (educational) tools we bring into this world?

The second example is, at its core, a struggle as old as humanity: between the common good and individual freedom. And, as one would expect, it implicates digital technology at its center. Surveillance capitalism has co-evolved with technology to produce an infrastructure that runs on unprecedented levels of knowledge about the masses, with effective mechanisms for behavior prediction and manipulation at scale [6]. Now this infrastructure can potentially inform our response to the pandemic. In Austria, the formerly state-owned telecommunications provider produced aggregated data about people's mobility for the government after the first lockdown measures. While the public largely supported the lockdown, this use of information was met with suspicion. Of course, Google provides a similar analysis for all the countries of the world (www.google.com/covid19/mobility/) and uses movement data to chart the least busy times to shop in supermarkets. Helpful now, no doubt, but also a simple repurposing of information that the company collected for different reasons. It is interesting to note that, at least in Austria, the public seems to be wary of the state using that information, while we expect no less from private companies, who seem to seize the opportunity for whitewashing their practices.

A related function is contact tracing. In Austria, the Red Cross teamed up with Accenture and a private insurer to produce StoppCorona (participate.roteskreuz.at/stopp-corona/), an app that uses Bluetooth to estimate the distance between two mobile phones and performs a digital handshake if they are close enough. IDs are exchanged, and if someone tests positive, the logs allow the tracing of contacts to contain the spread. The Austrian data-rights NGO epicenter.works has analyzed the app and come to the conclusion that it does many things right—it was developed with privacy and security in mind. But no independent audits have yet been conducted, and questions around the involvement of private companies with vested interests are being asked. There are also fears that, while officially denied, it will become quasi-compulsory to have such an app installed in order to participate in the reopening of public life. While various projects rush to develop solutions that respect users' privacy, such as PEPP-PT (www.pepp-pt.org/), DP-3T (github.com/DP-3T/documents), or similar efforts by Google and Apple [7], the question remains whether privacy guarantees are enough to make us want to use an app that has the power to (de)legitimize our participation in society.
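The decentralized idea behind these apps can be sketched in a few lines. The following toy model is my own illustration, not the actual StoppCorona or DP-3T implementation: phones broadcast short-lived random IDs over Bluetooth and log the IDs they hear, all locally; after a positive test, only the patient's own broadcast IDs are published, and every phone checks its local log against that list. The class and function names are hypothetical.

```python
import secrets

class Phone:
    """Toy model of a decentralized contact-tracing app (illustrative only)."""
    def __init__(self):
        self.my_ids = []       # ephemeral IDs this phone has broadcast
        self.seen_ids = set()  # IDs heard from nearby phones, logged locally

    def broadcast_id(self):
        # Real apps rotate a fresh random ID every few minutes over Bluetooth,
        # so that observers cannot link broadcasts back to a person.
        eid = secrets.token_hex(16)
        self.my_ids.append(eid)
        return eid

    def record_contact(self, eid):
        # Stored only on the device; no central server sees contact logs.
        self.seen_ids.add(eid)

def handshake(a, b):
    """Two phones in Bluetooth range exchange their current ephemeral IDs."""
    a.record_contact(b.broadcast_id())
    b.record_contact(a.broadcast_id())

def exposed(phone, published_ids):
    """After a positive test, the patient publishes their broadcast IDs;
    each phone checks its own log against the published list, locally."""
    return bool(phone.seen_ids & set(published_ids))

alice, bob, carol = Phone(), Phone(), Phone()
handshake(alice, bob)               # Alice and Bob were near each other
# Bob tests positive and uploads only the IDs he himself broadcast.
print(exposed(alice, bob.my_ids))   # True: Alice logged one of Bob's IDs
print(exposed(carol, bob.my_ids))   # False: Carol never met Bob
```

Note what the design choice buys: the server learns only the IDs of confirmed cases, never who met whom—the matching happens on each person's device. That is the privacy property the decentralized projects argue for, and also why the political question of quasi-compulsory use cannot be settled by the protocol alone.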


Tracing data, next to medical testing, may become indispensable for making informed decisions around public health and the wide-ranging changes in our societies. However, the nature of the sociotechnical infrastructure that we build to produce this data determines what we can know, who it will discriminate against, and what we become through it—again, an ethico-onto-epistemological question [2]. This entanglement is slowly being acknowledged, as we witness a new quality of debate that recognizes that technology is deeply political. While the paradigm of surveillance capitalism has rampaged through our societies largely unchecked, with far-reaching consequences for our democratic structures, questions about whose interests are being served with tracing apps and how this is reconfiguring power are starting to gain traction. As in the educational context above, there will be no objectively correct design decisions in using big data for keeping pandemics at bay. And as with the notorious trolley problem, there is no right answer to the question of how much privacy we may want to give up in exchange for saving a certain number of lives. These will be situated choices that, I argue, need to be negotiated. We also need to find appropriate formats for people to participate in this process of agonistic struggle for desirable (technological) futures [8]. Again, we should be guided by the question of what the technology we bring into this world will make us and if this is who we want to become.

It may well be that public health becomes the next national security—an inherently elusive yet indisputable desire of people that is being misused to justify technological surveillance. As we did around 9/11, we might see the coronavirus serving as the scapegoat to implement modes of mass behavior manipulation by private companies. However, current public discourse offers glimpses of hope that society might have come to realize certain things in this pandemic: Digital technology is not just a tool. Innovation is a political arena in which we can participate. Technology creators are political actors who cannot be allowed to be above democratic accountability. And we can have a voice in shaping technological futures—as they shape who we become through them. In a very posthuman, relational way, the virus may have shifted our relationship with technology.

References

1. Frauenberger, C. Entanglement HCI, the next wave? ACM Transactions on Computer-Human Interaction 27, 1 (2019), 2:1–2:27. DOI: 10.1145/3364998

2. Barad, K. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke Univ. Press, Durham, NC, 2007.

3. Latour, B. Reassembling the Social: An Introduction to Actor-Network-Theory. Clarendon Lectures in Management Studies. Oxford Univ. Press, Oxford, UK, 2005.

4. Warschauer, M. Technology and Social Inclusion: Rethinking the Digital Divide. MIT Press, 2004.

5. Brandtzæg, P.B., Heim, J., and Karahasanović, A. Understanding the new digital divide—A typology of Internet users in Europe. International Journal of Human-Computer Studies 69, 3 (2011), 123–138. DOI: 10.1016/j.ijhcs.2010.11.004

6. Zuboff, S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, New York, 2019.

7. Greenberg, A. How Apple and Google are enabling Covid-19 contact-tracing. Apr. 10, 2020; https://www.wired.com/story/apple-google-bluetooth-contact-tracing-covid-19/

8. Mouffe, C. Agonistics: Thinking the World Politically. Verso, 2013.

Author

Christopher Frauenberger is a senior researcher at the Human-Computer Interaction Group, TU Wien (Vienna University of Technology). His research focuses on designing technology with and for marginalized user groups, such as people with disabilities. He is committed to participatory design approaches and builds on theories and methods from diverse fields such as action research, disability studies, philosophy of science, and research ethics. [email protected]


Copyright held by author. Publication rights licensed to ACM.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.
