XXIX.2 March - April 2022
Page: 74

Toward a worker-centered analysis in fighting disinformation: Global south perspectives

Jonathan Ong


In the Philippines, content moderators, otherwise known as "digital janitors," work tirelessly to scrub filth, gore, and lewd photos off our social media. The country is one of the biggest hubs of outsourced content moderation work, an outgrowth of its call center industry—the largest in the world [1]. Unlike moderators in other major hubs such as India and the U.S., who mostly screen content shared by people in those countries, content moderators in offices around Manila evaluate posts from all over the world [2]. In spite of the psychological trauma of constant exposure to images of killings, beheadings, and abuse, workers sign up for such ghost work [3] for a reliable source of income for themselves and their families. After all, the digital industry has long been celebrated as a "sunshine industry," where creative "knowledge workers" are hailed as modern-day heroes for their contributions to the Philippines' economic development.

Never far from content moderators are troll farms, which similarly enlist young people, often from the precarious middle classes, for influence operations. The everyday labor of disinformation production includes creating and maintaining fake accounts, designing both positive and negative ("attack") memes, and monitoring mainstream media pages in search of news stories they can express support for or attack en masse depending on their clients' whims. Low-level disinformation workers are often recent college graduates seeking extra income through a short-term side gig. Election season, which spans three months but involves a longer lead-up period during which political strategists begin mobilizing their click armies and pitching their portfolios to politicians, presents many opportunities for those seeking extra cash.

As my ethnographic research into the country's disinformation shadow economies has also uncovered, some people are occasionally deceptively enlisted into a troll project. Bright-eyed young people with new college diplomas may officially sign up as graphic designers for a digital marketing agency or legislative aides for a politician. But once these graduates become part of the everyday grind, Filipino cultural expectations of pakikisama (getting along with others) and intimidating line managers pressure them to become "team players," making them operate fake accounts, pitch snarky hashtags, and produce artificial engagement for political principals, especially during election season [4]. The misogynistic and homophobic speech they are incentivized to produce then becomes the controversial content that the platforms' content moderators are expected to scrub from social media.

The geographical and social proximity of content moderation farms and the flourishing, ever-expanding disinformation industry embedded within "respectable" creative industries—and even within government offices themselves—is ironic, if not downright dystopic. Notable pro-democracy activists and journalists [5] have told the Philippines' story to the world in hopes of lobbying Silicon Valley platforms for greater accountability for the spread of hate speech and the manipulation of electoral outcomes. My worker-centered research contribution to the disinformation debate, however, argues that we also need to deepen our understanding of the complex arrangements of our global digital economy and the porous boundaries between "respectable" digital work and "shadowy" disinformation production. The supply side of disinformation will not be addressed by focusing only on the responsibility of platforms and their content moderators, nor by deplatforming political leaders, but rather by understanding how distributed labor arrangements and the lack of regulatory oversight of digital and creative industries underpin the problem. The problem of disinformation for hire is particularly acute in the Philippines, where a personality-oriented political system compels politicians to spend on political branding experts, who clean up their image while smearing their opponents'. The examples of Bell Pottinger and Cambridge Analytica, of course, point to the global problem of how propaganda operations and dark PR firms are increasingly being hired by governments, politicians, and political parties to manipulate online discourse.

Lobbying for better and faster content moderation is fair, especially given that platforms act with urgency only when responding to a public scandal or tragedy. Researchers, however, especially those from the Global South, must find better ways to expose how disinformation production has become big business. It's crucial that we identify the exact regulatory loopholes being exploited by private-industry actors and assign responsibility not only to low-level paid trolls but also to the strategic masterminds behind influence operations. While it has become commonplace to call for a whole-of-society approach to mitigating disinformation, we have rarely ventured outside of Euro-American society to consider the variety of work arrangements in Global South disinformation production.

Researchers must find better ways to expose how disinformation production has become big business.

Disinformation studies, largely defined through a Euro-American lens, has framed the issue of trolls in light of white nationalist populism, Trumpism, and misogyny in the Internet "manosphere." My work on the Philippines and Southeast Asian disinformation economies argues that we need critical exploration of how a large and readily available creative workforce has taken on political consultancies and influence operations—often with loose attachment to political ideology. This inquiry helps us understand how new media crony capitalists aiming to disrupt the status quo appeal to ordinary people's entrepreneurial aspirations while stoking their distrust of the legacy media, which they insist are always conspiratorially aligned with an "elite" establishment. It also points to how disinformation cannot be disentangled from economic insecurity, especially for younger people seeking to make ends meet in either content moderation or troll farms that offer steady income.

In the remainder of this article, I'll discuss the diverse work models of disinformation production in the Philippines. I argue that the Philippines is distinct in the disinformation space not because of President Rodrigo Duterte's unique brand of populist leadership, nor because "Facebook broke democracy" in the country, but rather because of the diversity of disinformation work models at play.


Disinformation Work Models in the Philippines

Table 1 updates a similar report that I coauthored for the NATO Strategic Communications Centre of Excellence [6], incorporating recent revelations of foreign interference by the private research firm Graphika [7], which, working directly with Facebook, uncovered a network of fake accounts operated by individuals in China's Fujian province.

Table 1. Disinformation models in the Philippines.

In 2019, my colleagues and I uncovered foreign interference in Filipino politics. We examined how Chinese businesspeople enlisted local PR firms to boost the electoral campaigns of those running for local-level political positions in cities where they intended to launch new businesses [8].

The diversity of disinformation models operating in the Philippines underscores that disinformation does not require a dark net to support a black market. Instead, the disinformation industry in the Philippines is embedded within the political system and the creative industries. Taking this broader perspective enables us to zoom out from ahistorical or personality-centered explanations that attribute the "infodemic" in the Philippines to the current administration or certain villainous online personalities as the main purveyors of fake news.

Many media accounts of paid trolls and "fake news queens" spotlight influencers or low-level fake-account operators. It is more challenging to demand political accountability from the high-level strategists orchestrating such campaigns. A few of these strategists are actually happy to take credit for the electoral victories of their clients and actively seek publicity. Journalists should be wary of these individuals, as such notoriety increases their clout with potential political and corporate clients. The savviest of these strategists make their power moves behind the scenes. In Architects of Networked Disinformation [4], which featured ethnographic "portraits" of the diverse workers behind political campaigns in the Philippines, one strategist drew inspiration from the Game of Thrones character Olenna Tyrell, whose discreet, subtle political maneuverings commanded great respect from those in the know.

The disinformation work models described in the table are not mutually exclusive. Depending on the campaign being waged, they can be deployed in various combinations. For example, state disinformation producers or political strategists may collaborate with specialists operating clickbait websites, just as local PR firms worked with Chinese business entities to promote specific political candidates in 2019.

We need a whole-of-society approach to the Philippines disinformation industry.

Another understudied feature of the Philippines disinformation ecosystem is the psychological and moral justifications of the diverse workers behind these shady campaigns. Most disinformation work in the country is not ideologically driven, so many workers are able to distance themselves from the content they produce and the consequences of their campaigns [10]. The short-term and project-based nature of some of these campaign tasks also indicates that few people are employed as full-time trolls. Instead, they're able to displace responsibility onto others whom they consider to be bigger and "badder" actors in a wholly amoral information environment.

Tech Regulation from Below

We need a whole-of-society approach to the Philippine disinformation industry that engages the private sector and demands higher levels of transparency and accountability. Civil society, researchers, and journalists can also help engage private industry and work as independent auditors to have open and honest conversations about the ethics of creative and digital work and identify vulnerabilities in the current system.

It is also important that researchers help put pressure on industry leaders and regulators, as journalists in the Philippines may themselves be reluctant to antagonize those who control the corporate advertising money that their news agencies depend on. For this to move forward, pro-democracy allies and foreign donors need to lend support to local civil society and researchers to advocate for tech regulation "from below," rather than reproduce the same disinformation interventions that have not adequately coped with the powerful businesspeople who are invested in keeping the current system in place. Tech regulation from below means more-robust tracking of election campaign spending, transparency mechanisms for identifying the PR agencies and political consultants behind campaigns, and support for multistakeholder alliances that can ensure fair and honest elections. Interventions must also be expansive enough to include social safety nets for precarious digital workers and privacy protections for whistleblowers seeking to expose unethical practices.

In a global context, political economy analysis and investigative journalism of the business of disinformation have exposed the inner workings of data analytics firm Cambridge Analytica and PR agency Bell Pottinger. Researchers, especially those studying the Global South, need to collaborate and build momentum in lobbying for greater transparency and accountability of digital and creative industries. These sunshine industries enlist workers for both cleaning and polluting our information environments in similarly precarious arrangements, yet their leaders have rarely had to answer for the political and human costs of their thriving enterprises.

References

1. Bajaj, V. A new capital of call centers. New York Times, Nov. 25, 2011.

2. Dwoskin, E., Whalen, J., and Cabato, R. Content moderators at YouTube, Facebook and Twitter see the worst of the web—and suffer silently. Washington Post, Jul. 25, 2019.

3. Gray, M.L. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Harper Business, 2019.

4. Ong, J.C. and Cabañes, J.V. Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines. Newton Tech4Dev Network, U.K. and Philippines, 2018.

5. Interview with Maria Ressa. The Facebook Dilemma. Frontline, Apr. 19, 2018.

6. Ong, J.C. and Cabañes, J.V. Politics and Profit in the Fake News Factory: Four Models of Political Trolling in the Philippines. NATO Strategic Communications Centre of Excellence. Riga, Latvia, 2019.

7. Nimmo, B., Eib, C.S., and Ronzaud, L. Operation Naval Gazing. Graphika, Sep. 22, 2020.

8. Ong, J.C., Tapsell, R., and Curato, N. Tracking Digital Disinformation in the 2019 Philippine Midterm Election. New Mandala, 2019.

9. de Guzman, W. How social media "influencers" helped Twinmark disseminate fake news. ABS-CBN News, Mar. 8, 2019.

10. Ong, J.C. and Cabañes, J.V. When disinformation studies meets production studies: Social identities and moral justifications in the political trolling industry. International Journal of Communication 13 (2019), 5771–5790.

Author

Jonathan Corpus Ong is an associate professor of communication at the University of Massachusetts Amherst. He is a research fellow at the Harvard Kennedy School's Shorenstein Center, where he leads the research project "True Costs of Misinformation," exploring the impact of targeted harassment on human rights defenders in global context.


Copyright held by author. Publication rights licensed to ACM.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.
