Dialogues

XXVIII.1 January–February 2021

Power and technology: Who gets to make the decisions?


Authors:
Jennifer Lee, Meg Young, P. Krafft, Michael Katell

POSITION: Power and Technology: Who Gets to Make the Decisions?

What would it mean to shift the balance of power, such that historically marginalized communities wielded ultimate authority in deciding if, and not just how, technologies are built? In such a world, what kinds of technologies would be allowed to exist, and what rules would govern the development and use of those technologies? With what lens and for whom would terms such as cost and benefit be defined? These are important questions to ask as students, professors, and researchers in the fields of human-computer interaction, information science, and computer science shape conversations around the creation, deployment, and regulation of technologies.


When academics—as well as artists, educators, organizers, and policymakers—talk about how to build and regulate powerful AI-based tools, it is important to recognize that there is a long and ugly history of technologies being designed and deployed to target, surveil, and harm those most vulnerable in our society. Technologies, however rudimentary or advanced, have always disproportionately impacted communities of color, religious and ethnic minorities, sexual and gender minorities, and other marginalized communities. We cannot ignore that technologies enable surveillance, which has been, and will always be, at its root about power. Who has the power to watch and police whom, with what tools, and for what purpose?

Underscoring power dynamics in the decision to build, design, and use a technology is critical to advancing equity, because every technology reflects a set of value choices made by people, often people in positions of privilege and power. Each value choice means that different groups win or lose. We see power dynamics playing out not only in the ways we create and use technology, but also in the ways we design and deploy everything else around us. Slanted and segmented benches, bolts installed on steps, and boulders under bridges are not simply architecturally interesting features in our municipalities—they are also examples of hostile architecture purposely designed to deter people experiencing homelessness from existing in public spaces [1]. Similarly, history books, statues, and national holidays that celebrate a legacy of colonization serve to perpetuate dominant colonial culture and to minimize and erase the trauma of those colonized. There is intent in the ways the objects around us and the culture we breathe have been forged; similarly, there is intent in the ways in which all technologies exist. When a technology's existence, design, and purpose are fashioned without direction from those who have been historically disempowered, technology will undoubtedly serve to exacerbate existing structural inequities.

How Has Technology Been Weaponized by Those in Power?

We can see many examples throughout history of technology being used to enforce existing power structures. In 1713, New York City passed a "lantern law" that required only Black and Indigenous people to illuminate themselves at night by carrying a lit candle—the surveillance technology of the time [2]. The law deputized white citizens to enforce slave and carceral systems using candles, a tool that disproportionately benefited white citizens while being weaponized against Black and Indigenous communities.

In the 1930s and 1940s, the U.S. government subcontracted with IBM to use its Hollerith punched-card tabulating machines to surveil, target, and unconstitutionally incarcerate Japanese Americans during World War II [3]. These were the same machines that Nazi Germany used during this period to implement its extermination campaign against Jewish people and other perceived enemies of the state [4]. Although the technology was originally created for census tabulation, the machines and the data they collected were quickly weaponized for bigoted and xenophobic identity tracking.

In 1956, the FBI launched a spying program called COINTELPRO, which through the 1960s used wiretaps and cameras to target and indict anti-Vietnam War protestors, as well as civil rights leaders such as Martin Luther King Jr., on unfounded conspiracy charges [5].

In the 2000s, the New York Police Department used automated license-plate readers (ALPRs) to power its unconstitutional, decade-long surveillance of the Muslim community, collecting massive amounts of data about people at mosques as part of an illegal spying program.


Today, U.S. Immigration and Customs Enforcement (ICE) is partnering with companies such as Amazon, Palantir, and Microsoft, and using technologies including ALPRs, cell snooping devices, and face surveillance to target immigrants for deportation [6].

As these examples demonstrate, the creation and use of technology to surveil and harm marginalized groups is not new. However, institutions that oppress marginalized groups are now equipped with tools truly unprecedented in their surveillance power, such as facial recognition, location tracking, drones, and other AI-based tools. If increasingly powerful, invisible, and unaccountable technologies are built without adequate consideration of the impacts on communities, they will continue to exacerbate structural racism and other inequities.

How Do We Fight Back and Shift the Balance of Power?

In order to build community decision-making power, we need to focus on changing power structures within the different contexts in which we operate. Whether we are academics, artists, technologists, educators, lawyers, organizers, or policymakers, we must continuously practice sharing institutional and personal power with historically marginalized communities. Everyone must take on the role of uplifting the voices of those historically disempowered and ensuring that such voices have the most weight in deciding if, not just how, technologies are deployed.

Everyone must be an advocate. If we are not actively advocating for equity, we are working against it, by perpetuating and reinforcing the tidal wave of structural inequity shaped over centuries of colonization and dehumanization. We all have a responsibility to speak truth to power, to urge the institutions in which we work to cede decision-making authority to marginalized voices, and to share our power with those who have less. To do this advocacy, we must foster interdisciplinary relationships to leverage our different skill sets.

Everyone has a role. Academics can pressure educational institutions to give legitimacy to community voices; technologists can refuse to build tools that negatively impact marginalized communities; artists can help communicate complex concepts to many audiences; policymakers can devise legislation that bans or restricts the use of certain technologies.

When we participate in conversations on the creation, deployment, and regulation of technologies, we all have a responsibility to leverage any privilege and power we hold to question existing norms and assumptions, and to defer to the expertise of the communities that best understand how technologies have inflicted, and continue to inflict, harm.

Community Engagement Is Not Enough

In doing our advocacy, it is important to recognize that historically marginalized communities are the experts on the impacts of technology and surveillance. Our goal must be to bolster the ability of communities to share their expertise and exercise decision-making power. We must not dismiss, co-opt, or exploit the lived experiences of communities to purposely or inadvertently entrench existing power structures.

Too often, the community engagement processes created by corporate, governmental, nonprofit, and educational institutions do not serve to equip communities with decision-making power, but rather function to co-opt voices and legitimize decisions that have already been made. Task forces, community engagement meetings, and outreach processes that ask community members to draft lengthy reports, provide feedback on or create recommendations, and repeatedly share their lived experiences often demand time and energy without also giving communities meaningful decision-making influence.

Additionally, community expertise in decision-making processes is often undervalued, while academic, technical, and legal voices are elevated as the only experts and given authority, even when those "expert" voices do not come from impacted communities and contradict community expertise. Without articulating the specific objective of ensuring community decision-making power, community engagement processes can function as a perfunctory and performative means to shield the status quo.

Before we embark on a community engagement process surrounding the creation and deployment of a technology, we should be questioning the norms and assumptions inherent in that process. We must urge our peers, colleagues, employers, and elected representatives to consider the following questions surrounding decision making:

  • Have communities already vocalized their support for or opposition to the technology in question, and if so, how will this feedback be considered in decision making?
  • What authority do historically marginalized communities have in deciding if, and not just how, a system is implemented?
  • Will certain groups reap benefits from the use of a technology while other groups face disproportionate harms?
  • Who gets to define costs versus benefits and weigh whether or not a technology is worth building or procuring?

Asking these questions is just the start to shifting power structures.

How Are We Working to Build Community Power?

In Washington state, we are working on shifting power structures by strengthening interdisciplinary relationships and creating spaces for impacted community voices to drive decision making. A key component of the technology and liberty work at the ACLU of Washington is growing a Tech Fairness and Equity Coalition composed primarily of representatives of historically marginalized communities. We work with this coalition to advocate for technology policies that center the voices of communities disproportionately targeted by surveillance tools. We are also working with academics, tech workers, artists, students, and organizers to build toolkits that expand the capacity of impacted communities to question policymakers regarding the deployment of powerful AI-based technologies.

We recently launched the Algorithmic Equity Toolkit, a collaborative project with the Coalition and the Critical Platform Studies Group that was driven by impacted communities expressing the need for tools to better analyze AI-based technologies (https://www.aclu-wa.org/AEKit). Similarly, we are working with the Coveillance Collective to build countersurveillance toolkits incorporating history, art, community stories, and movement into our understanding of the impacts of technology and surveillance (https://coveillance.org/). We hope to continue to collaborate with impacted communities, students, technologists, artists, educators, and organizers to develop tools led by and for communities.

To highlight some recent advocacy and legislative work, we and the Coalition are:

  • Fighting hard to pass a statewide face-surveillance moratorium law in Washington that would give communities the opportunity to decide whether facial-recognition technologies should be used at all. In 2020, the face-surveillance moratorium bill we supported passed unanimously out of the House policy committee, with many impacted community members testifying in front of lawmakers; this year, we hope to advance the moratorium proposal even further.
  • Urging the Port of Seattle to prohibit the use of facial-recognition technology at port facilities and reject collaboration with U.S. Customs and Border Protection—a sister agency of U.S. Immigration and Customs Enforcement (ICE)—to implement face-surveillance systems.
  • Working with communities to draft and introduce people-centric privacy legislation in the 2021 Washington state legislative session. In 2020, we worked hand in hand with impacted communities and collaborated with students, academics, technologists, lawyers, and policymakers to successfully defeat a weak, corporate-centric data-privacy bill that would have set a ceiling for privacy protections in Washington.
  • Advocating for a statewide algorithmic accountability bill that would make it illegal for government agencies to discriminate using AI-based automated decision systems.

As we do this advocacy, we are continuously learning how best to share our power and lift up community decision-making power. We recognize that the process of shifting power dynamics and fighting for community power is never finished, is not easy, and cannot be done in silos. We must come together as advocates to leverage our different skill sets, learn from successes and failures, and share our personal and institutional power if we want to create a world in which historically marginalized communities wield authority in deciding if, and not just how, technologies are built and deployed.

Position References

1. Chellew, C. Defending suburbia: Exploring the use of defensive urban design outside of the city centre. Canadian Journal of Urban Research 28, 1 (2019), 19–33.

2. Schomburg Center for Research in Black Culture, Photographs and Prints Division, The New York Public Library. 'A law regulating Negroes & slaves at night.' The New York Public Library Digital Collections; http://digitalcollections.nypl.org/items/510d47db-bd0f-a3d9-e040-e00a18064a99

3. Seltzer, W. and Anderson, M. After Pearl Harbor: The proper role of population data systems in time of war. Annual Meeting of the Population Association of America, 2000; https://margoanderson.org/govstat/newpaa.htm

4. Milton, S. Registering civilians and aliens in the Second World War. Jewish History 11, 2 (1997), 79–87.

5. United States Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities. Intelligence Activities and the Rights of Americans. 1976.

6. National Immigration Project, Immigrant Defense Project, and Mijente. Who's Behind ICE? The Tech and Data Companies Fueling Deportations. 2018.

Author

Jennifer Lee is the technology and liberty manager at the ACLU of Washington (ACLU-WA), where she advocates for state and local legislation to regulate powerful surveillance and AI-based technologies. She leads ACLU-WA's work drafting and implementing technology policies that center community voices and protect everyone's privacy and civil liberties. [email protected]

RESPONSE: A Response to 'Power and Technology': A Call for Scholar Activism

Alvaro Huerta [1] defines a scholar-activist as both a bridge and conduit between academic institutions and the communities in which they work, putting the resources and privilege of the former into service for the latter toward the ultimate object of advancing social, racial, and economic equity. This definition clarifies several things about what scholar-activism means. First, scholar-activists think of their institutions, research, and platforms as resources that can be directed. Second, scholar-activists intentionally direct these resources toward communities outside academia. Third, even as some career incentives in academia are at odds with this goal, scholar-activists commit to social justice. We three authors of this piece began working together under the name the Critical Platform Studies Group (CritPlat) in 2018. As a small research collective of early career scholars, we have been inspired by the scholar-activist mode of engagement over the past two years. In this article, we point to other research in computing that has inspired us, share what pursuing an activist mode has meant for us so far as a group, and report on what challenges we have faced along the way.


All academics dedicate time and energy to their work, putting something of themselves and their values into it. Even in computing, many scholars have done politically charged work since the beginning of the profession, including the notable early work of Computer Professionals for Social Responsibility [2]. Speaking to an ACM and design audience, then, what is at stake in aligning ourselves more explicitly to an activist politics?

For us, this orientation has meant that we begin our research with a theory of social change. We take as our starting point that while many projects in computing and data science to date have been oriented toward social good [3], not all of these projects fit within Huerta's conception of scholar activism as a practice of channeling resources, building relationships, and addressing the social, racial, and economic structural disparities in our society head-on. An activist orientation requires us to see research as action, to build capacity, and to contribute to broader social movements.

Inspirations and Aspirations

We take inspiration from researchers who set out to support and amplify activist objectives. For instance, Our Data Bodies is a research justice project that works with local communities to develop practical tools for data literacy, digital self-defense, and community resilience in an increasingly data-driven society. NoTechForICE, organized by the migrant rights organization Mijente, is a national grassroots, student-led campaign for divestment from Palantir and other tech companies collaborating with the U.S. Immigration and Customs Enforcement agency, one that combines direct action with bespoke research on university involvement with those companies. We look to projects like Lilly Irani and Six Silberman's Turkopticon as an exemplar of how to build a system that improves the labor conditions of precarious workers while challenging the exploitative platform design of a major tech company. We also look to the empirical, conceptual, and methodological examples set by scholarship on algorithmic justice [4], data feminism [5], and design justice [6].

We similarly take inspiration from the thought and guidance of racial and social justice activists. In her 2017 book, Emergent Strategy: Shaping Change, Changing Worlds [7], Detroit-based activist adrienne maree brown provides a set of conceptual tools for stronger and more sustainable social justice movements. A central theme is the charge of Detroit civil rights activist Grace Lee Boggs to "transform yourself to transform the world," a concept that brown encapsulates through the metaphor of the fractal. To say that social change is fractal is to underline how our personal interactions and daily lives connect to larger-scale stakes, such as our research choices, institutional context, and society. In this respect, scholar-activism is not just about what we decide to research but also about how we show up with others. In an April 18, 2019, talk at the Seattle Public Library, brown asked:

How many people would say again, "In my intimate relationships I am practicing transformative justice"?… How many of you right now would say there's a functioning democratic process happening in your household… your block… your city?… Thinking fractal[ly] we [must practice] at the small scale something that we can actually bring up to a larger scale.


As academics, our personal, departmental, and community-scale changes can be the staging ground for larger societal shifts. In these settings, scholar-activists can refer to pragmatic guidance set out by social justice groups, such as the Bay Area Solidarity Action Team's "Protocol and Principles for White People Working to Support the Black Liberation Movement" (Figure 1). Some of the principles that brown raises, however, highlight fundamental tensions for academics interested in doing this work. She explains the importance of interdependence and decentralization for a healthy social movement. But academic career advancement depends on taking credit and centering yourself, and the work of organizing and campaigning in support of social movements is not easily reflected on a CV.

Figure 1. The Bay Area Solidarity Action Team (BASAT)'s protocols and principles, as excerpted by brown in Emergent Strategy [7].

Our Experiences to Date

We turn again to Huerta's conception of scholar activism as creating bridges and connections between academic institutions and the communities in which they work, putting the resources and privilege of the former into service for the latter toward the ultimate object of advancing social, racial, and economic equity.

The first project we three authors completed together was inspired by the effort to think of our academic institutions and platforms as resources that can be directed. In early 2019, we met an activist from the Greenlining Institute in Oakland, California, Haleema Bharoocha, who was looking for partners willing to co-organize and host an event on racism and bias in machine learning. With funding set aside by the University of Washington Information School for a speaker series on computational social science, we were able to provide university space, publicity, and honoraria for an event held on campus in April 2019, attended by a local audience and streamed live. The panel event, "Racism and White Supremacy in Algorithmic Systems," featured local advocates, activists, community organizers, and educators. Bharoocha directed the steps we took to ensure the event was inclusive, such as making the space wheelchair accessible and scent-free and providing sign language interpreters, space reserved for elders, and plus-size seats. We learned the value of sharing printouts of definitions for key terms with audience members, coordinating on content with panelists beforehand, and circulating press releases. By writing Bharoocha's prompts for the audience on the walls, we started audience conversations before and after the event with Post-it notes and markers. The conversations among those present before, during, and after the event were one way to help foster relationships and build capacity.

Our second project was inspired by the idea that scholar-activists intentionally direct resources toward communities outside academia. Initiated within the University of Washington eScience Institute's Data Science for Social Good program, our Algorithmic Equity Toolkit project co-created a set of tools for community advocates to use in posing critical questions about government technologies. Following a participatory action research approach, we aimed to center and be accountable to the goals of our partners at the ACLU of Washington (ACLU-WA) in what we made and how it evolved over time. Our partnership with ACLU-WA allowed us to connect with activists and community organizers in the Seattle Tech Fairness Coalition, a group of local civil rights organizations engaged in the fight for surveillance reform. As Jennifer Lee underlines in her piece "Power and Technology," we aimed to be respectful of these partners' time; in some cases we were able to pay community advocates for their expertise. However, a key challenge we encountered was how to fund the work as a community-based project. At the time, the only readily available funding was from Big Tech. We ultimately decided to turn down a grant from Amazon in favor of maintaining partners' trust and the political commitments of the project. This gap in funding left each of us precarious in different ways, and much of our time on the project became in-kind. The privilege of freely dedicating work to unfunded projects is one that many people do not have, and even for us it carried material and career costs.

A third project of ours aimed to intervene in the tensions between the incentives of academic career advancement and our commitments to social justice. We were inspired in part by the #FundingMatters campaign, in which scholars in our field, in solidarity with the immigrants' rights group Mijente, criticized the Privacy Law Scholars Conference and the Amsterdam Privacy Conference for receiving funding from Palantir. In response, in 2019 we organized a panel interrogating tech-industry funding, "Patron or Poison? Industry Funding of HCI Research," at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW). In it, we asked how industry funding might impact the scope and content of the work we do. Our panel featured both early-career scholars and those more senior and central to the community: industry researchers and researchers based in universities. Immediately following the panel, we staged a guerrilla tabling event with a spray-painted banner parodying that year's CSCW logo (Figure 2). We used our table to distribute zines and collect anonymous submissions about how industry funding had impacted respondents' work. This DIY action built on energy from that year's ACM Conference on Human Factors in Computing Systems (CHI), at which an attendee had defaced a poster listing the conference sponsors. We ask the community to continue building on this energy by reimagining conferences as sites of direct action, disruption, and joy, where we can intervene in our own community in an effort to foster new forms of discussion.

Figure 2. CritPlat disrupting the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW). The hand-painted sign hanging from our table replaces "Computer" with "Corporate," reading "CSCW 2019: Corporate Supported Collaborative Work."

Points of Direction

Even as we reflect on what in our work we want to deepen or rethink, we share the following directions with others on the same path.

Use progressive stack. Progressive stack is a strategy for facilitating meetings and large discussion groups by eliciting contributions from people in marginalized groups before those from people who are not; it was popularized at the Occupy Wall Street protests in 2011. The strategy is used to remediate recognized inequities in who is most likely to speak and be heard, and to prevent majoritarian decision making. It can inspire many aspects of scholar-activist practice, from adopting feminist citation practices [8] to leading class discussions, extending panel invitations, mentoring, hiring, and the other ways in which voices are amplified in scholarship. We can also direct opportunities toward others. As Guzmán and Amrute put it: "Ask yourself, for each topic you present, each yes or no you give to a request, where are the women of color? Who can I suggest who would be a better person than me to be the expert here? Who do I want to be in community with?" [8].

Publish a funding integrity statement. Writing a funding integrity statement (and making it available wherever we post our bio or CV) means disclosing the sources of funding we accept as scholars, the conditions under which we accept it, and the criteria we use in deciding which financial support to take.

Channel resources. Resources, defined broadly, can include mentoring time, granting a platform, or amplifying on social media—in addition to funding and job opportunities. Just as the Our Data Bodies project by Tawana Petty, Mariella Saba, Tamika Lewis, Seeta Peña Gangadharan, and Kim Reynolds works with community organizers, we can channel the resources available to us to those already doing the work in order to support their existing efforts.

Follow the objectives of communities. Scholar-activists can begin their work by following a specific community's or organization's stated needs (cf. [6]). A robust tradition called participatory action research provides a clear methodology for such an approach. By working in concert with partners through an action-reflection cycle, researchers are more likely to produce knowledge or tools that can be put in service of current efforts and strategies.

Foster long-term relationships. Given the pressure in academic institutions to publish for career advancement, the timelines of academic and community objectives are likely to misalign. Fostering long-term relationships with communities, advocacy organizations, and activist groups can make our research more accountable to the people we work with and, more important, can make us present for the organizing and advocacy needs to which we can put our own voices, time, power, and resources. Research that does not arise from the genuine needs of community activists, or that is not pursued on the basis of long-term trusting relationships, will be extractive.

Take direct action at conferences. We call for academic computing conferences to become places where speakers might be interrupted by a noisy protest from an organized group of students, or by other hubbub. Our scholarly communities are active conversations among colleagues and friends—not idle literatures. What if dissenting views were presented with art, banners, tabling, teach-ins, and performances? What if interventions into the field were literal and joyful?

Some people we have met have reservations about using the term activist. Some worry that, as researchers, aligning ourselves with activists might compromise public trust in academic institutions. Others have a very different concern: that academic work does not rise to the ethos or standard of activism and should not seek to center itself in any struggle. In spite of these unresolved tensions, we bring considerable power as scholars to this moment: as academics adjacent to Big Tech firms in our collaborations and funding relationships, as researchers choosing where to direct our time and energy, and as educators of a new generation of technologists. We can look for ways to turn this power toward existing efforts for social change.

Conclusion

We write this piece midstream in an unfinished process of learning how to do research, how to engage politically, and how to maintain relationships with partners over time. Here we have attempted to expose our process to this point. But our process is an evolving one and we hope to dedicate our careers to this end. In sharing the guideposts we have used along the way, we hope to have surfaced some conceptual tools and considerations that will be useful to others at our same juncture. In this fraught moment, this search for ways to turn the considerable time and attention dedicated to our work to broader purposes has made scholarship more rewarding. We will continue to look for ways to steer in this direction and look forward to connecting with others heading the same way.

Response References

1. Huerta, A. Viva the scholar-activist! Inside Higher Ed. Mar. 30, 2018; https://www.insidehighered.com/advice/2018/03/30/importance-being-scholar-activist-opinion

2. Finn, M. and DuPont, Q. From closed world discourse to digital utopianism: The changing face of responsible computing at Computer Professionals for Social Responsibility (1981–1992). Internet Histories 4, 1 (2020), 6–31.

3. Pal, J. The fallacy of good: Marginalized populations as design motivation. Interactions 24, 5 (Sep.–Oct. 2017), 65–67.

4. Buolamwini, J. The Algorithmic Justice League. MIT Media Lab, 2016; https://www.media.mit.edu/projects/algorithmic-justice-league/overview/

5. D'Ignazio, C. and Klein, L.F. Data Feminism. MIT Press, 2020.

6. Costanza-Chock, S. Design Justice: Community-led Practices to Build the Worlds We Need. MIT Press, 2020.

7. brown, a.m. Emergent Strategy: Shaping Change, Changing Worlds. AK Press, 2017.

8. Guzmán, R.L. and Amrute, S. How to cite like a badass tech feminist scholar of color. Data & Society: Points. Aug. 22, 2019; https://points.datasociety.net/how-to-cite-like-a-badass-tech-feminist-scholar-of-color-ebc839a3619c

Authors

Meg Young is a postdoctoral fellow at the Cornell Tech Digital Life Initiative in New York City. Her research focuses on accountable uses of government technology and data, with an emphasis on procurement processes for automated decision systems. She received her Ph.D. from the University of Washington Information School. [email protected]

P. M. Krafft is a senior research fellow at the Oxford Internet Institute in the University of Oxford's Social Science Division. Krafft's research, teaching, and organizing aim to bridge computing, the social sciences, and public sector work toward the goals of social responsibility and social justice. [email protected]

Michael Katell is a postdoctoral fellow at the Alan Turing Institute in London, England. His current research concerns the ethical and policy implications of digital technology use in the criminal justice system. He received his Ph.D. from the University of Washington Information School. [email protected]


Copyright 2021 held by owners/authors
