Mac Arboleda, Palak Dudani, Sayash Kapoor, Lorna Xu
Digital platforms obfuscate the power arrangements that undergird them in part by mimicking aspects of real life in their user interfaces. A Web page for answering questions becomes a "forum," a group on Facebook becomes a "community," all to make interaction with digital platforms seem frictionless. Carceral logics are embedded in the design of digital platforms: Reporting and deplatforming become the only sanctioned ways to resolve conflict within online communities. Community standards reinforce this carcerality, mimicking legal language to convey a false sense of legitimacy without input from any "community." But while digital platforms might mimic these aspects of real life, they don't operate by the same rules.
We are better able to discern and parse the social values of the "analog" spaces we navigate in real life than those of digital platforms, because the underlying values of online platforms are obfuscated. In a city, a bar may require an entry charge while a mall lets you in for free with the expectation that you'll consume. A café might let you use its bathroom as long as you buy a cup of coffee, as opposed to a public park, where you can loiter without having to shell out money. Through an audiovisual project, we turn digital platforms into physical city spaces as an interrogation of their unstated values (see Figure 1).
We chose five city spaces that parallel the operating dynamics of social media platforms (see Figure 2). Facebook isn't a public square; it's a mall that aims to keep you inside while profiting off your presence. Double standards in platform governance and content moderation that favor big advertisers and state agents represent a gated community that is closed off to the public. Top-down and anti-Black content-moderation practices mimic the carceral logic of the prison. Content-moderation sites are the digital poultry farm—invisible and yet indispensable to the platform. And finally, in imagining a truly open community space, we end with a provocation: Where are the public parks on the Internet? Each of these sections consists of visual representations we created accompanied by audio snippets from journalists, activists, and artists whom we interviewed about the logics that underlie social media platforms. These snippets provide insights on censorship, labor, and commodification, backed by personal experiences as workers from different fields with varied relationships with digital platforms.
|Figure 2. Two screenshots showing the Web project in action. Left: The digital platform is the city. The city consists of a mall, a gated community, a prison, a poultry farm, and a public park. Right: The on-screen layout of each city space.|
"We've got it all for you" — SLOGAN OF SM SUPERMALLS, THE LARGEST MALL CHAIN IN THE PHILIPPINES
In Manila, the mall is where you can buy basic necessities such as food, medicine, and clothing. It's also where you can pay for housing, electricity, and water. Most of the time, it's where many public transportation stations are located. The mall is designed to keep you inside, to accumulate as much profit as possible (see Figure 3). As the slogan for the most popular mall chain in the Philippines, SM Supermalls, goes, "We've got it all for you." The centralization of needs in privatized space undercuts public infrastructure. Digital platforms are implemented as a substitute, if not an outright replacement, for infrastructure that needs to exist. Facebook even claimed to defy commodification in its old slogan "It's free and always will be," but we know there are costs to using its services.
Digital platforms commodify our attention and profit from advertisers committed to selling a message. These entities rely on data shared by Facebook, which also accommodates partnerships with governments. Facebook takes money from advertisers promoting disinformation, soliciting personal information through fake ads, and enabling political repression. Malls in real life mirror the constant surveillance of digital space through facial recognition and data mining, practiced under the guise of community "safety." Logging on to Facebook is like walking into the mall, and once you leave, it's like you never actually left: Digital platforms will work even harder to find the products you are most likely to buy, offer attractions that will bring you back, and expand their extractive practices in the name of "better service."
Under the guise of neutrality and fairness, social media platforms signal progressive values while remaining complicit in crimes against humanity and systematically favoring the powerful (see Figure 4): Politicians are not fact-checked, content from big advertisers and media outlets is not subjected to moderation or scrutiny, and platforms put business interests before anything else, bowing to whoever occupies the infrastructure of a given nation-state.
"We do not submit speech by politicians to our independent fact-checkers, and we generally allow it on the platform even when it would otherwise breach our normal content rules." — NICK CLEGG, FACEBOOK VP OF GLOBAL AFFAIRS
While a tiered hierarchy is inherent in the design of social media platforms, the extent to which platforms rely on this hierarchy to maintain legitimacy is hidden. So, while Facebook might claim that its content-moderation practices are neutral and apply to everyone on the platform equally, big advertisers and media houses receive preferential treatment when it comes to moderation. One example is media outlet Breitbart receiving a free pass to post misinformation on Facebook.
Digital platforms replicate carceral logics through surveillance and content moderation. Platforms craft content-moderation policies reflecting their business interests rather than those of the communities using their platforms. This top-down approach fails to address specific local contexts in which content is created. In cases where "local context" is taken into account, it often relies exclusively on the state's narrative. With little to no explanation or recourse, these policies are enforced through punitive measures of content removal and deplatforming. Under the guise of neutrality, dissenters and activists are the first people to be banned, shadow-banned, or deplatformed, their ability to connect and engage cut off. Political dissidents are silenced due to state interests, while hate speech by state agents continues to proliferate (see Figure 5).
"[Platforms] have the power to enforce policies and laws through their own perspective only, and that's not the best solution." — MARIAN HUKOM, GRAPHIC ARTIST
Policing and prisons uphold the anti-Blackness fundamental to the function of capitalism and colonization. Digital platforms in turn reflect these values through the act of content moderation. Black people are more likely to have their content flagged when using African American Vernacular English on these platforms, where they also face increased scrutiny and harassment from other users.
Being banned from these platforms, which serve as spaces for maintaining relationships and sharing information, severs people's ties with their communities, effectively isolating and silencing them. A more effective approach to content moderation and platform governance would take into account the shared values and perspectives of community members in ways that allow for shared accountability structures and otherwise judicious ways of interacting in digital spaces.
Scholar and abolitionist Carrie Freshour describes ubiquitous poultry farms as establishments that sustain meat production while creating and maintaining exploitative labor practices [1]. Poultry farms are an extension of the prison industrial complex: They rely almost exclusively on cheap labor as one of the few businesses willing to hire former inmates, then subject those workers to exploitative conditions that often cause premature disability. Similarly, content-moderation sites subject moderators to exploitative work environments. Hazardous work conditions, long workdays, and conditions that promote ill health are labor practices that content-moderation sites and poultry farms share, making the content-moderation site a digital poultry farm (see Figure 6).
|Figure 6. Workers at poultry farms and content-moderation sites perform similarly repetitive tasks.|
"[Content moderators] on social media who are usually outsourced from developing countries, in their jobs they are forced to stomach grotesque images and videos…they do not [get] credited for the work they do." — SAMANTHA DEL CASTILLO, JOURNALIST
We look at content-moderation sites not as inevitable sites for exploitation, but rather as places to imagine alternatives. What are collective alternatives to platform governance and moderation that do not necessitate labor exploitation?
The problem becomes more apparent when we observe that the city does not truly welcome its inhabitants (see Figure 7). Malls displace open spaces for profit; gated communities reproduce carceral approaches, pooling resources for VIPs and politicians; prisons restrict freedoms; and exploitative, dehumanizing poultry farms are hidden from view while their chickens are crucial to the survival of the city.
|Figure 7. If the digital platform is the city, then where are the public parks?|
How can we design a better city? How can we exercise our rights and freedoms without capitalism moderating us as content? Eli Pariser, codirector of Civic Signals, an initiative that envisions flourishing, public-friendly digital spaces, says that "functional public spaces are central to this work. They allow us to assemble, to share common experiences. … Great public spaces are owned by everyone and therefore ought to be designed for everyone" [2]. What can public spaces on the Internet look like? How can we collaboratively design these spaces as truly ours? If we can't build these public parks yet, at the very least we should be imagining futures where we can. How can we share the city?
This article is an accompaniment to the collaborative Web project "The Platform as the City," which can be viewed at bit.ly/platform-as-city.
1. Freshour, C. Poultry and prisons. Monthly Review (July 27, 2020); https://monthlyreview.org/2020/07/01/poultry-and-prisons/
2. Pariser, E. To mend a broken Internet, create online parks. Wired (Oct. 13, 2020); https://www.wired.com/story/to-mend-a-broken-internet-create-online-parks/
Mac Andre Arboleda is an artist and graduate student at the University of the Philippines. He is the founding president of the UP Internet Freedom Network and cofounder of the Artists for Digital Rights Network. See his work at www.sickinternet.me. email@example.com
Palak Dudani is Service and Interaction Design Specialist at Fjord, part of Accenture Interactive in Oslo, Norway. She is a hybrid designer whose core interest lies in exploring the social and cultural dimensions of complex systems. firstname.lastname@example.org
Sayash Kapoor is a Ph.D. student at Princeton University, Princeton, NJ, where he studies overoptimism in machine learning at the Center for Information Technology Policy. email@example.com
Lorna Xu is a designer and organizer based in Los Angeles, CA, interested in decommodified housing and abolitionist futures. Her current day job is product designer at the Walt Disney Company. firstname.lastname@example.org
Copyright held by authors.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.