The perils of next-gen surveillance technology


Author: Juan Hourcade
Posted: Fri, February 22, 2019 - 3:27:20

Low-cost, high-performance data capture, storage, and processing capabilities are changing our world. We are in the era of big data. Most large organizations store vast amounts of data about every aspect of their operations, just in case they need it at some point, and turn it over to data scientists in the hope of gaining insights.

This data revolution has the potential for positive outcomes. For example, it can help model important phenomena such as climate change. In medicine, it can help prevent the spread of infectious disease more efficiently and effectively.

However, in spaces where information about people has few if any legal protections, these developments could have grave consequences. A particularly concerning emerging phenomenon is the enormous imbalance between individuals and powerful organizations in access to data and in the ability to analyze and act on it. This kind of information inequality inevitably leads to power imbalances, with a high likelihood of negative effects on individual freedoms and democratic values, in particular for the most vulnerable.

There is increasingly broad awareness of large companies that gather vast amounts of information about people in exchange for free services, thanks in part to Shoshana Zuboff’s recent book The Age of Surveillance Capitalism. My concern in this post is with research on surveillance tools and methods that could make the situation even worse by tracking every little thing we do without proper consent (i.e., full awareness of the extent of surveillance and how the data are used, and a meaningful choice not to participate), most often with little or no benefit to those being surveilled.

Such surveillance research tends to carry a concerning way of thinking. Its first component reminds me of early 20th-century Italian Futurism, with its emphasis on speed, aggressive action, and risk-taking. In this incarnation, it is a thirst for data about people and a rush to build highly invasive tools, with little concern for how they may be repurposed by organizations operating without an ethics board. Italian Futurism glorified the cleansing power of war and rejected the weak. The surveillance technology that worries me aims to identify people who do not fit a particular norm, potentially leading to cleansing or manipulation through data.

A second characteristic is a highly paternalistic approach: the organization assumes it knows better than the people with whom it interacts and therefore has the right and moral authority to conduct surveillance.

A third characteristic is an Ayn Rand–like approach to ethics, which assumes that the self-interest of the organization leads to societal benefit.

Why am I concerned? For employees, in particular those in professions and trades where they can easily be replaced, surveillance technologies could prove, and are already proving, quite damaging. For example, they could help companies understand how to squeeze the most work out of workers for the least pay, fire overworked employees as they burn out, obtain the data needed to automate tasks and eliminate jobs, use social network analysis to eliminate possibilities of union organizing, and lay off people who do not fit in (most likely those from vulnerable groups). For people in a customer relationship with a company, the consequences may not be as severe, but they may include customized efforts to keep them from leaving, whether or not that is in their best interest, and appeals to their passions and emotions in order to manipulate their behavior.

The greatest concern I have is how these technologies could be used, and are already being used, by police states where there are few legal guarantees for civil liberties. In such cases, these technologies could be used to track dissidents, listening in on every conversation and knowing about every personal connection, every purchase, every move, every heartbeat and breath, and could lead to a variety of punishments for people who do not fit the norm and for those connected to them.

So what should we do? We need to redouble our commitment to the values that have traditionally been at the core of our discipline. First, when working on tools or methods that involve data about large numbers of people, we need to abandon the approach of aggressive, risky, speedy action aimed at growing fast, getting a grant, or beating the competition, no matter what breaks. Instead, let's be guided by societal needs and think about the possible consequences of our research. How do we do this? We put those who are likely to be affected by a technology at the center of our processes. We can then also avoid paternalistic approaches and self-serving ethical frameworks.

I feel that part of our community has slowly been abandoning our early commitment to putting those affected by the technologies we design at the center of our design processes. I hope this blog post serves as a wake-up call to colleagues, and to our community, to re-engage with these values and do our part to steer technology toward benefiting society at large and away from augmenting existing power imbalances that are likely to damage individual freedoms and civil rights.



Juan Hourcade

Juan Pablo Hourcade is an associate professor in the Department of Computer Science at the University of Iowa, focusing on human-computer interaction.


