Forums

XXVI.2 March - April 2019
Page: 70

Not-equal: Democratizing research in digital innovation for social justice


Authors:
Clara Crivellaro, Lizzie Coles-Kemp, Alan Dix, Ann Light


Digital technology has given rise to extensive socioeconomic transformation, and emerging technologies are set to further transform the service economy and public services. Careful design and deployment are needed for this transformation to benefit the many rather than the few. If harnessed to the wrong economic, political, and social models, technological innovation has the potential to be detrimental to the most vulnerable; its careless application can amplify existing forms of injustice and create new forms of exclusion in socioeconomic life, further exacerbating socioeconomic inequality and social division.


Advances in artificial intelligence (AI), machine learning, and their application within the platform economy, smart cities, and digital services raise new concerns for HCI and IxD researchers. These include questions about the logic and biases embedded in the design of algorithms, big data analytics, data provenance, and their governance; about how the very systems HCI conceives may contribute to different forms of oppression [1]; and about the politics and business models underlying systems in the sharing economy, including the new, often invisible struggles experienced today by platform-economy workers [2]. As it stands, more often than not, digital technology seems to simply reinforce and reproduce existing economic systems of industry, commerce, health, and government to the detriment of decency, dignity, and care. Ultimately, digital technology can either reinforce inequality or help mitigate it.

With Not-Equal, we invite HCI communities to join us in holding a mirror up to HCI practice and the systems we create as a community, in order to identify, develop, and sustain the conditions under which digital technology can support social justice. Not-Equal is a new initiative that aims to build a sustainable network of interest and resource collaborations between multidisciplinary researchers and communities from industry, civic, and civil society. Beyond exploring the interrelationship between technology and oppression [3], taking a social justice approach means, for us, attending to how both the fundamental design of technologies and their application can realize the positive impact of digital technologies more equitably for all.

In line with participatory design traditions and calls for pluralism in computing [4], we believe that answering the question of what may be unjust as a result of digital innovation requires collective efforts and the perspectives of all involved. The social sciences have developed sophisticated ways to analyze technologies once they are in society; HCI researchers, computer scientists, and R&D labs are at the forefront of conceiving and experimenting with new technologies; charities championing social justice have a thorough understanding of the issues their service users face; government bodies and policymakers have a responsibility to the people they serve and safeguard; people are experts at their own lives.

It is for these reasons that Not-Equal brings together researchers and partners in industry, civil, and civic society (including informal groups) to understand and explore issues of fairness and justice in technology design and implementation and to co-create the necessary responses to make our digital society work for the many, not just the few. Through the funding and resourcing of events and collaborations, we hope to create opportunities for engineers to rub shoulders with policymakers and charities working in areas of deprivation, and for universities and schools to pioneer new methods for understanding the implications of digital innovation.

Challenges Posed by Digital Innovation

Beyond the digital divide, we recognize a need for systematic scrutiny of the technologies that perpetuate social injustice at the micro- (the level of technology design and interaction), meso- (the application of technologies across contexts and domains), and macro-levels (the bodies and systems that regulate the development and use of particular technologies). We must bring the actors operating at all these levels into dialogue with one another and with those who are at the receiving end of their innovations and interventions. We also need to establish ways to speak back to the societal forces that compromise social justice in technology development. We have identified three interrelated challenge areas in need of particular attention: algorithmic social justice, digital security for all, and a fairer future for businesses and workforces.

Algorithmic social justice focuses on co-developing responses to the challenges posed by the new data economy, including the opacity of AI systems and exclusive access to the data and data processing behind the digital services and automated, semi-autonomous decision making that affect us all. We use algorithms and digital systems to sort through large datasets and make decisions about healthcare, prison sentencing, our personal lives, and the management of public services, including who might be eligible to benefit from them. Both the politics and the logics underlying the design of these algorithms, data-processing capabilities, and their applications deserve to be publicly scrutinized, understood, and, where appropriate and possible, challenged and changed. With our partners, we wish to question algorithmic characterizations of fairness, and to understand and explore how notions of social justice might be operationalized and evaluated in designing the processes of data-driven algorithmic decision making.
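To ground this, consider one of the simplest algorithmic characterizations of fairness: demographic parity, which asks whether a system's positive-decision rate differs across groups. The sketch below is a minimal illustration of ours, not a Not-Equal tool or method; the eligibility decisions and groups are hypothetical.

```python
# A minimal sketch of one contested fairness characterization:
# demographic parity. Data and groups are hypothetical.
from collections import defaultdict

def positive_rates(decisions):
    """decisions: iterable of (group, decision) pairs, decision in {0, 1}."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, decision in decisions:
        totals[group] += 1
        positives[group] += decision
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(decisions):
    """Largest difference in positive-decision rates between any two groups."""
    rates = positive_rates(decisions)
    return max(rates.values()) - min(rates.values())

# Hypothetical benefit-eligibility decisions: (group, approved?)
decisions = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
print(positive_rates(decisions))          # {'A': 0.666..., 'B': 0.333...}
print(demographic_parity_gap(decisions))  # 0.333...
```

Even this tiny metric embeds contestable choices (which groups to compare, what gap counts as unfair, whether parity is the right notion at all), which is exactly why such characterizations call for public scrutiny rather than purely technical evaluation.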

Digital security for all explores how, with our network of partners and their communities, we can co-create a more inclusive digital security. We do this through design approaches that make digital security in IoT and digital-service design more responsive to issues of agency, capability, and socioeconomics. By making security-modeling approaches more accessible and better able to articulate socioeconomic and political contexts, we provide ways for groups and individuals to identify and acknowledge the different securities and insecurities at work within digital designs. We also use creative engagements to encourage groups to consider the wider context in which digital services and technologies are deployed, so that the security processes of threat questioning, identification, and response become part of ongoing digital design.
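As a concrete illustration of what articulating "different securities and insecurities" within a single design might look like, the sketch below records how one design feature can secure one stakeholder while exposing another. This is our hypothetical example, not a Not-Equal method; the feature, stakeholders, and effects are invented.

```python
# Sketch of a stakeholder-centered security model: the same design
# feature can be a security for one actor and an insecurity for
# another. All names and effects below are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class SecurityEffect:
    stakeholder: str   # who is affected
    kind: str          # "security" or "insecurity"
    description: str   # what is secured or put at risk

def report(feature: str, effects: List[SecurityEffect]) -> None:
    """Print who a design feature secures and who it exposes."""
    print(f"Feature: {feature}")
    for e in effects:
        print(f"  [{e.kind:>10}] {e.stakeholder}: {e.description}")

# Hypothetical feature: mandatory smartphone-based two-factor login
# for an online benefits service.
report(
    "Mandatory smartphone 2FA for benefit claims",
    [
        SecurityEffect("service provider", "security",
                       "reduces account takeover and fraud"),
        SecurityEffect("claimant without a smartphone", "insecurity",
                       "locked out of a benefit they depend on"),
        SecurityEffect("claimant in shared housing", "insecurity",
                       "device access may be controlled by others"),
    ],
)
```

Enumerating effects per stakeholder, rather than per system asset, is what lets a group see whose security a design actually serves.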

[Figure: Not-Equal call out and challenge areas.]

Fairer future for businesses and workforces examines how economic forces, coupled with network opportunities, are creating new challenges for society as markets go global, jobs become “gigs,” and worker protections seem beyond the reach of these individualized service-provision arrangements. Terms such as the sharing economy suggest a benign, citizen-led cooperation facilitated by digital technology, but in practice they often involve supra-national corporations exercising control reminiscent of the worst excesses of the Industrial Revolution. While the platform economy offers winner-take-all status to new marketplaces and brokering businesses, these businesses thrive on the less visible production of services and data by fleets of competing providers. With our partners, we are looking at the processes and systems of the platform economy to explore its barriers and opportunities. We seek to re-envisage the role of digital technology, offering fresh patterns for processes and systems that realize equity in economic opportunity for all.

Democratizing Research: Co-Creating Responses to Make a Fairer Sharing and Data Economy

Co-creating responses to issues of social justice with our network partners requires us to move beyond exploring questions of ethics in technological innovation; we also need to find ways to meaningfully involve all sections and sectors of society in collaborative HCI research. Making space for those who are systematically excluded from partaking in digital innovation is a core principle for Not-Equal. For us, this includes all sections of society, not solely those who may be considered marginalized. Everyone is affected by technological developments, albeit in different ways, and there is a long gestation between development and deployment into everyday lives, which could be a time for ordinary people to understand and challenge potential uses.

Our aspiration is to help create the conditions for social justice to emerge through the resourcing and funding of Open Events, proposed by partners themselves or tailored to their interests and needs. These activities explore a specific issue within one or more challenge areas from different perspectives, develop ideas further, and/or build capacity for the development of responses or innovative technologies. We aim to use outputs from these explorations to shape and inform our Open Commissioning Programme, through which we support and resource network partners’ short- or longer-term collaborations responding to the burning issues they have identified. We are concurrently working with citizens (whether legally recognized or not) to constitute a Community Panel. Alongside a panel of experts from academia, industry, and civil society, the Community Panel will be tasked with reviewing and deciding which proposals should be commissioned. In this way, we explore models and approaches for democratizing and commissioning research in digital innovation, and lay the groundwork for meaningful justice-enhancing collaborations.

Democratizing research in technological innovation in the sharing and new data economy, however, presents new challenges. Recent studies [5] have highlighted the lack of a shared understanding and vocabulary around AI among researchers, civic society, governments, and policymakers. Computer scientists and machine-learning experts are not used to explaining, in lay terms, their working processes and the workings of the algorithms they conceive, or to engaging with the social-justice implications of these. The complexities underlying algorithmic design, and the ways in which we experience and live with data, necessitate the development of new methods to support meaningful explorations of data forms and materialities, including their applications and societal impacts.

With our network partners, we wish to investigate new and more sophisticated ways to make the invisible visible, and what may be deemed intangible tangible, in order to enable public explorations of current issues and their possible consequences. We also wish to enable AI and machine-learning computer scientists and security practitioners to better examine the social-justice dimensions of their work through reflexive and creative engagements.

HCI and IxD practitioners have a wealth of experience in conducting and facilitating cross-disciplinary work, experience that can be lent to crafting spaces for meaningful dialogic collaboration between those at the receiving end of innovations and the actors operating at the micro-, meso-, and macro-levels of digital innovation. Further, new collaborations among HCI, machine learning, law, and economics scholars and practitioners can offer exciting new ways to examine the consequences of innovations, and new spaces to operationalize responses. Taking a postdisciplinary approach to practice-led research, we aim to foster spaces for partners to reflect on what each practitioner/scholar does, assess what counts as knowledge and value, challenge assumptions on common issues, and develop practical responses by recruiting from different knowledge areas as and when needed. Such an approach, albeit familiar to most IxD and HCI communities, might require experts and practitioners to work on their empathy and to develop and put into practice a new level of generosity: the ability to put themselves in someone else’s shoes, embrace a different perspective, and responsively adapt the knowledge they can lend to a particular problem.

[Figure: Just or unjust? A wall exploring the social justice of everyday smart devices and technologies.]

With this, we hope to nurture a culture, in society and in digital innovation, that is able to examine the (un)just systems underlying digital technologies and to develop responses or propose alternatives. Since algorithms are produced within a social system, asking what may constitute justice in digital services, and for whom, might be the departure point. As part of this, we have so far developed a range of activities and public campaigns: “Just or unjust?” “Computer says…” and “DIY digital protection.” These aim to support people in exploring familiar and less familiar smart devices and technologies, articulating issues and experiences with computer systems, and pinpointing the one thing they would change to improve them, as well as who should help make that change. Included are the often taken-for-granted daily practices people use to protect themselves online and in their use of IoT.

Outputs from Open Events and network activities will serve to develop the agenda within each challenge area and to pinpoint the responses and research proposals the network should commission and support. These might include the rapid prototyping and proof of concept of a new technology, the reconfiguration and application of existing technologies and sociotechnical processes, or the development of toolkits to better understand the social-justice implications of existing technologies or to guide their novel application within a social justice framework. All materials and outputs from Not-Equal engagement activities and commissioned projects will be rigorously open source and available to partners to scale out, replicate, and take further. That “injustice anywhere is a threat to justice everywhere” [6] could perhaps not be more evident than it is today: Digital technologies and their consequences are pervasive and arguably have no borders; we must work together, everywhere.

We are at the start of this exciting initiative. Make a difference. Join the Network: www.not-equal.tech/#join

References

1. Fox, S., Dimond, J., Irani, L., Hirsch, T., Muller, M., and Bardzell, S. Social justice and design: Power and oppression in collaborative systems. Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing. ACM, New York, NY, 2017, 117–122.

2. Clark, L. The gig economy threatens to take us back to pre-Industrial Revolution times. Wired. June 23, 2017; https://goo.gl/WmN69U

3. Dombrowski, L., Harmon, E., and Fox, S. Social justice-oriented interaction design: Outlining key design strategies and commitments. Proc. of the 2016 ACM Conference on Designing Interactive Systems. ACM, New York, 2016, 656–671.

4. Björgvinsson, E., Ehn, P., and Hillgren, P-A. Participatory design and “democratizing innovation.” Proc. of the 11th Biennial Participatory Design Conference. ACM, New York, 2010, 41–50.

5. Involve. Artificial intelligence: What do the public really think about its potential impact? Report; https://goo.gl/sxLsQE

6. King, M.L., Jr. Letter from Birmingham Jail. 1963.

Authors

Clara Crivellaro is an HCI researcher based at the School of Computing’s Open Lab, Newcastle University. She specializes in interaction design research and the design of tools and sociotechnical processes to support democratic practices, social justice, and engagement with communities, industry, and civil society organizations. clara.crivellaro@ncl.ac.uk

Lizzie Coles-Kemp is a security researcher based at the Information Security Group, Royal Holloway University of London. Her focus is the intersections between security of society, security of the individual, and security of technology. She specializes in working with underserved communities in the context of their digital interactions. lizzie.coles-kemp@rhul.ac.uk

Alan Dix is director of the Computational Foundry at Swansea University, a project that aims to grow the digital research base in Wales for the good of the region, nation, and world. As well as HCI research, he has interests in the rural economy, creativity, algorithmic bias, and reimagining industry. alan@hcibook.com

Ann Light addresses themes of social and ecological justice, the co-making of futures, and the politics of design. She specializes in the social impact of technology, bringing a background in arts, humanities, AI, and human-computer interaction to bear on innovation in social process and cultural change using participatory methods. ann.light@sussex.ac.uk


Copyright held by authors. Publication rights licensed to ACM.

