On November 12, 2021, the public pension fund for the state employees of Ohio filed a class action securities lawsuit against Facebook, alleging that it had purchased Class A Facebook stock at artificially inflated prices. According to the suit, these inflated prices were the result of Facebook's intentional cover-up of the harms its products caused around the world. After Frances Haugen revealed these harms in the Wall Street Journal, the suit alleges, the fund and its beneficiaries sustained substantial losses. As the court filings state, this case "arises from an egregious breach of public trust by Facebook, which knowingly exploited its most vulnerable users—including children throughout the world—in order to drive corporate profits."
Press reports about harms done by Facebook products revealed in Haugen's testimony largely focused on the company's efforts to make products enticing to young people (as young as nine years old) even while those products, especially Instagram, were deemed dangerous, particularly to teenage girls' well-being. In a somewhat lesser-reported but still noteworthy revelation, Haugen's testimony described Facebook's negligence in how its products were used to foment large-scale, genocidal violence against ethnic and religious minoritized populations outside the U.S. As New York Times tech columnist Kevin Roose noted in an interview with National Public Radio's Brooke Gladstone, while Facebook has "billions of users," it does not value them all equally. In the "rest of the world," it has "too many users" and "barely anyone who can manage the platform," Roose said. In the U.S., however, he explained, Facebook is chasing younger audiences in a bid to stay relevant to advertisers, since it is the young users "that advertisers are trying to reach." While these may be separate, even diametrically opposed problems, they are nevertheless interlinked.
Chasing young users in the U.S.—and, we must admit, middle- and upper-class young users—is directly tied to their buying power: the consumer dollars they and their parents command, which are irresistible to the advertisers who make up the backbone of Facebook's revenue streams.
In the majority world, however, the average individual commands significantly fewer resources and therefore figures proportionately less in social media business models. It is no surprise, then, that while putting significant product-development resources into cultivating the lucrative youth market, as through the ill-fated Instagram Kids project, Facebook has shown little effort, interest, or inclination to provide basic safety features for people living anywhere else. Such features would include area experts; language experts; and representation, both in positions of authority and in the everyday work of content moderation, of people from caste-oppressed, Adivasi, Indigenous, and Muslim communities, descendants of enslaved and indentured peoples, and other non-majoritarian communities. At its most basic, it would mean more funds and support for teams working on trust and safety outside the U.S., even when these teams propose solutions that contravene the capital-accruing tendencies of the company itself.
The sharp inequities these revelations exhibit—the overheated pursuit of young eyeballs regardless of deleterious effects on youth well-being, on the one hand, and the callous disregard for how the platform is used to propagate violence and hatred against other populations, on the other—suggest an uncomfortable fact: Race, place, and position matter deeply to these tech companies, and not in the ways that their DEIA handbooks might suggest. As such, the Facebook Files exhibit a classic case of racial capitalism.
In Cedric Robinson's classic definition, which recently has been fairly described as inchoate but nevertheless remains influential, racial capitalism refers to the fact that capitalism could not have developed without turning some people into biologically predestined "haves" and others into biologically predestined "have-nots." The concept of racial distinctions, which predated the development of industrial capitalism, provided a convenient alibi. Though some scholars justifiably critique the concept of racial capitalism for the way it can be used as a synonym for a debilitated, deficit model of Blackness, the Facebook Files as a world event seem an apt occasion for the term, which explains the modes of dispossession and inclusion that categorize people, places, and actions, and binds them to the production of surplus value in particular ways. The long rise of industrialization depended on colonial and elite rapaciousness—through taxation, plantation economies, and the expropriation of Native lands—and on the enslavement of people who could be considered endlessly workable and less than human, as demonstrated in the scholarship of so many thinkers, from Jotirao Phule and Vine Deloria to Sylvia Wynter, Denise Ferreira da Silva, Paula Chakravartty, Paola Ricaurte, Thenmozhi Soundararajan, and Suzanne Kite.
Though these developments might seem safely ensconced in the deep recesses of the 19th century, the way race, and cognate concepts like caste, divides up the world matters even to such new players as social media tech companies. As developed in scholarship from Tressie McMillan Cottom, Robin Kelley, Gargi Bhattacharyya, and me in different contexts, racial capitalism addresses how these divided populations are included in the dreams of tech companies in multiple, even contradictory ways. On the one hand, some segments of racialized populations may be included as future consumers, to be courted as valuable users. On the other, some other segments are included as disposable populations, whose safety and well-being can be sacrificed to the metric of simply having billions of users worldwide. These same disposable lives are used in another way—when they produce the violent, invidious content that then circulates virally online. In other words, the problem of race and technology cannot be solved through the inclusion of Black and Brown populations, because they are already included as populations that labor and from which value is extracted, and for whom inferior, faulty, and violent products may be developed and offered up. As Charisse Burden-Stelly argues, racial capitalism produces a calculation of value minus worth: Some bodies, especially Black bodies that intersect with systems of imprisonment, experimentation, and risky work, produce surplus value at the same time that their lives have little worth.
Given Facebook's negligent attitude toward quality control outside the U.S., one may justifiably ask whether Facebook as it appears in places like Nepal, Sri Lanka, and Myanmar is even the same product as that offered in the U.S. Given the paltry numbers of content moderators, the company's lack of linguistic and cultural competence, and its failure to take the findings of Dalit and Muslim researchers seriously, it does not seem to be.
We have to ask: What would have happened, or might have been avoided, if the company had taken all the money, energy, concern, and brainpower devoted to chasing young Insta users (to their detriment) and put these resources into protecting lives in its preexisting products? Of course, this will not happen as long as companies like Facebook continue to see most of the world as expendable, useful only in terms of its masses. Even well-off populations in the U.S. have only a latent ability to turn discourse toward different ends; their efforts may remain largely metaphorical and comparative. Here, too, the glittery optics of who gets to speak, about what, and for what audiences govern a capitalism that relies heavily on racial and gendered imaginaries of expertise and trustworthiness—making Facebook whistleblower Haugen an appropriate spokesperson for the conscience of tech while according larger publics little authority to press for corrective measures.
In other words, the framework of racial capital is necessary in this current moment to move us beyond inclusion framed as a binary, or even as a first step on a path toward something broadly democratic. Instead, we need to show how seemingly divergent problems are actually yoked together by processes of inclusion that make the terms and conditions of that inclusion disastrous and dehumanizing, in different ways and different modalities, across the globe. We need to broach very real questions about what, precisely, these multiple versions of Facebook are. One version targets children; another excuses some users from obeying the rules of posting. Yet another tweaks the algorithm and ignores its own internal evidence of the cycles of violence that devolve from those tweaks. Meanwhile, a fourth is a marketplace for human trafficking, and a fifth treats entire swathes of its users as disposable lives. There are surely more versions as well (including the joyful ones that people all over the world create despite these tendencies). But we need to query who is mostly sorted into which version, and why.
We need to ask whether these other Facebooks that operate beyond the logics of consumer eyeballs are "junk technologies" foisted on populations who never asked for them, and where else we can look for alternatives. These are questions made possible by recognizing the multiple ways that capital, race, and technology are entwined, and these are possibilities that arise from that analysis, when all the places and beings left out of these calculations become sharply clear. These are the places to go for alternatives, if your analysis can take you there.
The chances of the current lawsuit's success are beyond my capacity to gauge. The filing nevertheless makes for compelling reading. The plaintiffs are arguing for damages to compensate for the losses they incurred when the company's stock dropped after the information contained in the Facebook Files hit newsstands. The suit alleges that company leadership willfully misled shareholders in several meetings about the nature of its business and the degree of risk the business was taking on—and that it even misled its own oversight board. On the terrain of contemporary business practices, misleading shareholders is a more prosecutable offense than going about things in the usual way, that is, treating Black and Brown populations as extractable, extra, destroyable lives. But on the terrain of moving past the moment of social media behemoths and their setbacks, it is precisely this usual way of doing business, and its ability to sustain existing power relations across the world, that must be undone.
6. Ralph, M. and Singhal, M. Racial capitalism. Theory and Society 48 (2019), 851–88; https://link.springer.com/article/10.1007/s11186-019-09367-z
7. https://journals.sagepub.com/doi/full/10.1177/2332649220949473; https://bostonreview.net/race/robin-d-g-kelley-what-did-cedric-robinson-mean-racial-capitalism; https://podcasts.apple.com/us/podcast/gargi-bhattacharyya-on-racial-capitalism/id1486603341?i=1000488792599; https://journal.culanth.org/index.php/ca/article/view/4175
9. Burden-Stelly, C. Modern U.S. racial capitalism: Some theoretical insights. Monthly Review 72, 3 (2020); https://monthlyreview.org/2020/07/01/modern-u-s-racial-capitalism/
11. Keyes, O. The Facebook whistleblower won't change anything. Wired. Oct. 18, 2021; https://www.wired.com/story/facebook-whistleblower-wont-change-anything/
Sareeta Amrute is an anthropologist who studies the relationship between race, work, and data. She is principal researcher at the Data & Society Research Institute and an affiliate associate professor at the University of Washington in Seattle. She is the author of Encoding Race, Encoding Class: Indian IT Workers in Berlin. firstname.lastname@example.org
Copyright held by author. Publication rights licensed to ACM.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.