XXIX.2 March - April 2022

Resetting the expectation of surveillance

Jonathan Bean


In an absentminded moment the other day, I zoomed in on Google Maps to look at the parking lot of the Ikea store in Phoenix, about two hours from my home in southern Arizona. Living in the midst of a pandemic in one of the states with the most Covid cases and the fewest restrictions means that risk calculation has crept into the most mundane corners of everyday life, from going to the grocery store to getting coffee. In my own effort to manage risk and reduce anxiety, I've found solace in the "Popular Times" function on Google Maps, which uses real-time data to present a histogram that predicts how busy a place will be.

But the histogram is relative, so it is most useful when you've experienced firsthand what it means for a place to be busy. The local grocery store, for example, offers a 10 percent discount for students once a week and, without fail, it gets extra crazy every Tuesday around 5:30 p.m., when everyone stopping in on the way home to pick up a few things for dinner converges with the crowd of deal-hungry students. The histogram tops out at exactly that time, and because it's a place where I shop regularly, I have a good sense of what it's like inside the store when the bar on the histogram is at 50 percent.

While I've moved around enough to log hard time in other Ikea stores, I've spent little time in the Phoenix one. When I zoomed in on the map, I was looking for more contextual information: OK, I thought, if it will be about 75 percent busy by the time I get there, what will that look like if there are this many cars in the parking lot and it's 50 percent busy now?
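The mental math here can be made explicit. As a purely hypothetical sketch (the function, the numbers, and the assumption that busyness scales linearly with cars in the lot are mine, not anything Google Maps exposes), the extrapolation is just a ratio:

```python
def estimate_cars(cars_now: int, busy_now: float, busy_later: float) -> int:
    """Scale an observed car count from the current busyness level
    to a predicted one, assuming the two are roughly proportional."""
    return round(cars_now * busy_later / busy_now)

# If I count 120 cars while the histogram reads 50 percent,
# a 75 percent prediction suggests roughly 180 cars on arrival.
print(estimate_cars(120, 0.50, 0.75))  # 180
```

Of course, the whole point of the anecdote is that the image I was scaling from wasn't live at all.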

I had a good laugh at my own expense—how silly to assume the Google Maps image was real time!—but then quickly realized this assumption was not as preposterous as I originally thought. Even in the absence of real-time aerial imagery, it would be possible, even relatively easy, to create a composite or simulated image showing a full parking lot at busy times, or erase cars from the image at times when the store was closed. This could have some real benefits for increasing public awareness of the impacts of auto-intensive infrastructure. Our parking lots, like our roads, are built to handle peak periods of demand that typically occur for a couple of hours on a few days each year. The majority of the time, most roads and parking lots sit empty. Being able to visualize this over time could help people see more clearly the amount of space and resources dedicated to the convenience of drivers. This would represent something of an inversion of the surveillance lens—turning our own data back on us so that we can more clearly see ourselves.

Since my last column, which dealt with the topic of eroding trust in technology and public institutions, I've been seeing more clearly how the ubiquity of technology has altered our expectations, behaviors, and even our emotional states. After some reflection, I was unsettled by the automaticity of the instinct I had to zoom into the image of the parking lot. What does it mean that some part of my brain thought, Well, of course I can get a live overhead view of a store two hours away, as though I'm living in some sort of spy thriller where overhead satellite views and CCTV feeds are only a couple of keystrokes away? We—or at least I—are becoming accustomed to being watched. Even the few people I know who were conscientious objectors to our smartphone state have traded their aging flip phones for a modest Android or iPhone, an act that smacks more of capitulation than defeat. Our collective actions have sealed the bargain: Convenience trumps privacy. Surveillance may not seem desirable, but it does seem inevitable: What other choice do we have?

This shared expectation of surveillance intersects with design in complicated ways. Foucault famously used the panopticon—a prison design in which a single guard has unfettered visual access to every prisoner's cell—as a metaphor to explain how power and surveillance are used to subjugate. Some question the transposition of this metaphor into our contemporary sociotechnical world by claiming the difference is that, unlike the prisoners, we don't know we are being watched [1]. But it's difficult to imagine this is the case today. As a column I wrote earlier in the pandemic explored, the shift to living and learning at home put on full display both the infrastructure and the ideology embedded in our social institutions. In the U.S., some have gone so far as to account for the emergence of protests over critical race theory as being rooted in the passive surveillance created when schools shifted to at-home education. Parents overheard teachers talking about history in unfamiliar ways and organized around what they saw as a threat to their children [2].

In the several years since hackers famously gained access to a casino via the Internet-connected equipment monitoring the water quality in a fish tank, an Internet-connected TV or TV add-on device has become standard issue in American homes, with about 70 percent of consumers across all age categories owning one. Smart speakers aren't far behind, approaching a 50 percent adoption rate. And around one in five households has Internet-connected lighting, locks, security cameras, or other gadgets [3]. It would be logical to conclude that, for many of us, whatever concerns we may have about security and privacy are outweighed by the convenience of all this connected stuff.

But that way of thinking is very much rooted in a conceptualization of the consumer as a rational economic actor. We know most people don't act like that; if they did, nearly everyone would drive the same car or choose the same model of refrigerator. People buy things for lots of reasons—very few of them rational. Sure, some things make life more convenient. But in the context of contemporary life, convenient usually means time-saving, or, more accurately, time-shifting. The robot vacuum lets you spend less time cleaning so you can spend more time gaming or cooking or working or whatever, blithely untroubled by the fact that it may or may not be uploading a geotagged floor plan of your house to the cloud. And other purchases support and reinforce our identities: Are you a PC or a Mac—or too cool for either? From this perspective, what's fueling adoption of all this connected stuff is a desire to live life in the technological present to the fullest. Privacy is a small price to pay for being seen by yourself and others as current and relevant.

Getting back to the expectation of surveillance: So much of our technological stuff doesn't really present us with a choice. Set up a new computer, load up a phone with apps, turn on that robot vacuum, or hop in the car, and the chances are pretty good that something, somewhere, is collecting data. Is this surveillance? The word, with roots in French and Latin, means to watch over, in the visual sense. Access to the private visual realm clearly crosses the line: Witness the emergence of the practice of taping over or physically disabling laptop webcams. In contrast, the streams of data we generate through our everyday use of technology, from smartphones to thermostats to light bulbs, are largely invisible. Is this why many of us are willing to overlook the surveillance we are setting up of our own accord? At any rate, the terms of service for Internet-connected software and hardware frequently present a binary choice: Either enable data "sharing" in the name of ease of use, or don't use the product. In cultures where we are all expected to be connected, this isn't much of a choice at all.

Anthropologists Mike Anusas and Tim Ingold have suggested that solar panels should hum, the better for people to be aware of the work they are doing [4]. Perhaps what we're seeing in the world—or, more accurately, not seeing—is the analogue of humming solar panels in the case of surveillance. If the data exists but we can't see it, we're more likely to forget that it's there. But if Google Maps is made to simulate a live view, or, at some point in the future, draws on satellite or drone feeds to show an actual live view, it becomes even more difficult to deny that there is something up there looking down at us. Omniscience is typically the realm of religion or philosophy, but the possibility that humans are creating the very omniscient being to which we may already be subject puts a different spin on the collective work of design and human-computer interaction. What would it mean to create systems, protocols, and data standards that talk to one another, rather than to a central (corporate) server? Who would make the hardware and software to power a future where we are connected to one another without the mandatory intermediation of a few powerful companies? And, most important, how do we shift away from the emergent sense that surveillance is necessary but problematic to an awareness that there exist alternatives to the core technologies of the connected world that do not require quite so much surveillance?

References

1. McMullan, T. What does the panopticon mean in the age of digital surveillance? The Guardian, Jul. 23, 2015.

2. Tong, S. and Mastromarino, J.P. Virginia voters weigh in on a neck-and-neck governor's race. WBUR, Nov. 1, 2021.

3. Vigderman, A. Smart home technology usage, satisfaction and purchase intent: 2020. Sep. 15, 2020.

4. Anusas, M. and Ingold, T. The charge against electricity. Cultural Anthropology 30, 4 (2015), 540–554.

Author

Jonathan Bean is assistant professor of architecture, sustainable built environments, and marketing at the University of Arizona. He studies taste, technology, and market transformation.


Copyright held by author

The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.
