Columns

XXVIII.6 November - December 2021

Democracy and data fatalism


Author:
Jonathan Bean


In The Green New Deal, Jeremy Rifkin describes a vision of the future that is dependent on distributed computing. As Rifkin sees it, energy generation and storage, financial transactions, and the operation of autonomous transportation will depend on microprocessors embedded in just about every aspect of our built environment [1]. What keeps this from being a dystopian future is Rifkin's all-in approach to open-source software, which he argues is the only way to keep all this essential infrastructure secure. With money and energy whizzing all over the place—not to mention cars, trucks, and buses—the stakes are high. While it's nearly certain that the payment tablets at one or two food carts are already part of a nihilist botnet army, security becomes a matter of life and death if the food cart is serving not only fusion tacos but also navigation services to nearby buses and cars.

Is my skepticism showing? Recent events close to my home in Arizona and around the world, both political and technological, have led me to question whether Rifkin's vision of the future is feasible. I'm not concerned about whether we'll have the technical capabilities—it's our shared trust in technology that seems to be slipping out of our grasp.

For one, consider the politically driven recount of votes in Maricopa County, Arizona. A certain former American president, not particularly happy with the outcome of a free and fair election, bullied the state Senate into commandeering ballot machines for an audit. The process was assigned to a private contractor that offered an absurdly low bid, essentially guaranteeing questions about the validity of the audit and fueling further distrust. County officials warned that if the Senate and contractor did not keep a strict chain of custody, they would have to decertify the machines, effectively making them useless for future elections. The rationale: The machines could not be relied upon because they had been under the control of unqualified and uncertified auditors. The fear was that during the audit the machines' software could be changed in a way that would alter the outcomes of future elections without leaving a trace. The machines were replaced at a cost of nearly $3 million of public money [2].

No evidence was offered that the machines were, in fact, compromised, but the mere possibility that it could happen was enough to ensure their demise. It struck me that the voting machines were being treated as though they were religious icons. Some see them defiled by their role in electing the other side's candidate; others, by an invasive and unnecessary audit conducted by an unqualified company with the less-than-confidence-inspiring name of Cyber Ninjas. Either way, it was a foregone conclusion that the machines had to be decommissioned. Would it have been technically possible to reuse the hardware by performing the voting-machine equivalent of wiping the drive, reinstalling the operating system, and verifying the checksum? I do not know for sure, but one would hope that the manufacturer of such critically important—not to mention expensive—equipment would have some way of confirming that the systems they designed, manufactured, and leased to the government had not been compromised. They are not machines, but totems of public trust. Restoring the integrity of the system is beside the point. Voting machines have become akin to disposable diapers: Once soiled, it's a one-way trip to the trash.
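To be clear about how mundane that verification step is in principle, here is a minimal sketch in Python of checksum verification, assuming (hypothetically) that the manufacturer publishes a known-good SHA-256 digest of the certified system image; the file name and expected digest below are placeholders, not anything a real vendor has published:

```python
import hashlib
from pathlib import Path

# Hypothetical digest the manufacturer might publish for the certified
# system image. The value here is a placeholder, not a real hash.
EXPECTED_SHA256 = "replace-with-manufacturer-published-digest"

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MB chunks so large disk images fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    image = Path("voting_system.img")  # hypothetical image file
    actual = sha256_of(image)
    if actual == EXPECTED_SHA256:
        print("Image matches the certified build.")
    else:
        print(f"Mismatch: got {actual}. The installation cannot be trusted.")
```

Of course, a checksum attests only to the image file itself, not to firmware running beneath the operating system, which is part of why restoring trust is harder than wiping a drive—and why the machines' fate was sealed by perception rather than by any technical finding.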

The necessity of their ritual destruction reflects a growing public distrust of technology. One could argue, at the risk of being labeled elitist, that this distrust stems from a lack of education about the fundamentals of how computers and security algorithms such as the ones in election machines work. But a better explanation might lie in the context of recent events in the U.S., where technology, which has long carried the mythic promise of a better tomorrow, seems to be yet another dark cloud on the horizon. In May 2021, the most mundane activity of everyday life for millions of Americans, filling up the car with gas, was disrupted when a hack of the Colonial Pipeline, which carries gasoline from Texas, took out fuel distribution for a good chunk of the Southeast and East Coast. People made fun of panic buyers filling plastic bags with gasoline, but the attack on that cherished, if environmentally problematic, symbol of American mobility and freedom carried more than a hint of darkness, especially when news broke months later that China had hacked into many other U.S. pipeline systems [3], feeding public anxiety and playing off the latest spin in the news cycle: a renewed debate about the origins of Covid. In the same time frame, we also learned that the ransom Colonial Pipeline paid in cryptocurrency—the whole point of which, according to crypto evangelists, is to exist independent of government regulation and be utterly, impenetrably secure—had been recovered by the FBI. And amid all the strangeness of the pandemic, a chip shortage turned America's car lots, glittering symbols of plenty dotting the landscape, into lonely swaths of asphalt. It was like some low-budget horror movie making a metaphorical villain a bit too literal: First we'll take their gasoline, then their cars!

The other landmark event undermining public trust in technology, though perhaps it will be superseded by the time this column makes it to print, was the Pegasus Project. A team of investigative journalists, working in conjunction with nonprofit organizations, uncovered a sophisticated spyware program that bypassed the security controls of even the most up-to-date iPhone and Android devices. Most troubling was that the spyware does not rely on tricking the user; in fact, it can be installed without the user taking any action whatsoever. Whatever shreds remained of a reasonable expectation of privacy were destroyed by the mere existence of spyware that's nearly impossible to detect and turns smartphones into surveillance devices. There is, on the one hand, widespread awareness of the complexity and interconnectivity of the information systems that make it possible to do everything from buying coffee to adjusting a thermostat halfway around the world. But I also notice an emergent sense—let's call it data fatalism—in these shifting expectations of privacy and security: We are headed toward a commonsense notion that technology can't be trusted. This played out once again shortly after the Pegasus Project's revelations, when Apple announced its iPhones would start scanning images and encrypted messages on the device to detect child sexual abuse material. Privacy advocates pointed out that the technology could be abused by repressive governments, leading Apple to delay the rollout. But in the broader context of American culture, few other than privacy advocates appear to question the necessity of exchanging their personal data for the use of a service, whether it's playing a game, using email, getting directions on a map, or finding a partner on a dating app. Definitions of data sovereignty [4], though well intentioned, seem impossibly out of touch with the agglomeration of data in a single cloud email account, let alone a user profile. Too much has been built on top of systems that equate data with profit. Changing course would be akin to changing the official language, currency, and religion of an entire country all at once. This has happened before, of course, but not always under the happiest of circumstances.


My concern is what happens when these two flows converge: one, the recognition that we have so little control over our own data that attempts to reclaim it are a lost cause; and two, a broader sense that technology has let us down. We have become blind to the sociotechnical systems in which we are enmeshed, instead seeing technological objects as the problem. Perhaps emboldened by the totemic quality of the voting machines, one opportunistic entrepreneur has introduced the "Freedom Phone," which early reviews indicate is nothing more than a lightly rebranded Chinese-made phone running a modified version of Android, backed by questionable promises of an "uncensorable" app ecosystem [5]. Images on the Freedom Phone website make it clear that this is an example of technology literally wrapped in the flag.

Regardless of ideology, the function of such devices is to destroy, not build, trust. Democracy, always a tenuous arrangement, is a complex cultural system, not a technological object. We would do well to remember that democracy is underpinned by the parks, roads, and walls we build or tear down, the strategies we devise to promote health and safety, and trust itself. The machines we use to vote should not be employed as symbolic props to reassure a rightfully nervous public. Whether or not we share the vision put forth by Rifkin in The Green New Deal, it might be time for a closer examination of how we are delegating trust in our democracy to devices, from voting machines to the Freedom Phone. It is time to ask whether information technology is in fact critical to the most basic functions of civil society—and whether it must be.

References

1. Rifkin, J. The Green New Deal: Why the Fossil Fuel Civilization Will Collapse by 2028, and the Bold Economic Plan to Save Life on Earth. St. Martin's Press, New York, 2019.

2. Latch, L. and Pitzl, M.J. Maricopa County will spend millions to replace voting machines turned over to the Arizona Senate for audit. The Arizona Republic. Jul. 14, 2021; https://www.azcentral.com/story/news/politics/arizona/2021/07/14/arizona-audit-maricopa-county-spend-2-8-m-replace-voting-machines/7965882002/

3. Volz, D. China compromised U.S. pipelines in decade-old cyberattack, U.S. says. Wall Street Journal. Jul. 20, 2021; https://www.wsj.com/articles/new-pipeline-cybersecurity-requirements-issued-by-biden-administration-11626786802

4. Hummel, P., Braun, M., Tretter, M., and Dabrock, P. Data sovereignty: A review. Big Data & Society 8, 1 (Jan. 2021). DOI: 10.1177/2053951720982012

5. Mihalcik, C. "Uncensorable" Freedom Phone raises a host of security questions. CNET. Jul. 16, 2021; https://www.cnet.com/tech/mobile/uncensorable-freedom-phone-raises-a-host-of-security-questions/

Author

Jonathan Bean is assistant professor of architecture, sustainable built environments, and marketing at the University of Arizona. He studies taste, technology, and market transformation. [email protected]


Copyright held by author

