Blog@IX

XXVI.4 July-August 2019

The perils of next-gen surveillance technology


Author:
Juan Hourcade


Low-cost, high-performance data capture, storage, and processing capabilities are changing our world. We are in the era of big data. Most large organizations store vast amounts of data about every aspect of their operations, just in case they need it at some point, and turn it over to data scientists in the hope of gaining insights.

This data revolution has the potential for immensely positive outcomes. For example, it can help model important phenomena such as climate change. In medicine, it can help prevent the spread of infectious disease more efficiently and effectively.

However, in spaces where information about people has few, if any, legal protections, these developments also have potentially grave consequences. An emerging phenomenon is the enormous imbalance between individuals and powerful organizations in access to data and the ability to analyze and act on it. This kind of information inequality inevitably leads to power imbalances, with a high likelihood of negative effects on individual freedoms and democratic values, particularly for the most vulnerable.

Thanks in part to Shoshana Zuboff's recent book The Age of Surveillance Capitalism, there is growing awareness of large companies that gather vast amounts of information about people in exchange for free services. My concern in this post is with research on surveillance tools and methods that could make the situation even worse by tracking every little thing we do without proper consent (i.e., full awareness of the extent of the surveillance and how it is used, along with a meaningful choice not to participate), most often with little or no benefit to those being surveilled.

Such surveillance research tends to carry with it a way of thinking that is concerning. Its first component reminds me of early 20th-century Italian Futurism, with its emphasis on speed, aggressive action, and risk-taking. In this incarnation, it is a thirst for data about people and a rush to build highly invasive tools, with little concern for how they may be repurposed by organizations without an ethics board. Italian Futurism glorified the cleansing power of war and rejected the weak. The surveillance tech that worries me aims to identify people who do not fit a particular norm, potentially enabling data-driven cleansing or manipulation.

A second characteristic is a highly paternalistic approach. The idea is that the organization knows better than the people with whom it interacts and therefore has the right and moral authority to conduct surveillance.

A third characteristic is an Ayn Rand–esque approach to ethics, which assumes that the self-interest of the organization leads to societal benefit.

Why am I concerned? For employees, particularly those in professions and trades where they can easily be replaced, surveillance technologies could prove quite damaging. For example, they could help companies squeeze the most work out of people for the least pay, fire overworked employees as they burn out, gather data to automate tasks and eliminate jobs, use social network analysis to prevent union organizing, and lay off people who do not fit the norm (most likely those from vulnerable groups). For people in a customer relationship with a company, the consequences may not be as severe, but they could include customized efforts to keep them from leaving, whether or not that is in their best interest, and appeals to their passions and emotions aimed at manipulating their behavior.


The greatest concern I have is how these technologies could be used, and are already being used, by police states, where there are few legal guarantees for civil liberties. In such cases, these technologies could help track dissidents, listen in on every conversation, and know about every personal connection, every purchase, every move, every heartbeat and breath, and could lead to a variety of punishments for people who do not fit the norm and for those connected to them.


So what should we do? We need to redouble our commitment to values that have traditionally been at the core of our discipline. First, when working on tools or methods that involve data about large numbers of people, we need to abandon the approach of aggressive, risky, speedy action taken to grow fast, win a grant, or beat the competition, no matter what breaks. Instead, let's be guided by societal needs and think about the possible consequences of our research. How do we do this? We put those who are likely to be affected by technology at the center of our processes. We can then also avoid paternalistic approaches and self-serving ethical frameworks.

I feel that part of our community has slowly been abandoning our early commitment to putting those affected by the technologies we design at the center of our design processes. I hope this blog post serves as a wake-up call to colleagues, and to our community as a whole, to reengage with these values. We must do our part to steer technology toward benefiting society at large and away from amplifying existing power imbalances that are likely to damage individual freedoms and civil rights.

Author

Juan Pablo Hourcade is an associate professor in the Department of Computer Science at the University of Iowa, focusing on human-computer interaction. [email protected]


Copyright held by author

