As an early-career human-computer interaction (HCI) scholar, I, along with student peers in CS and HCI, have found it challenging to engage with academic coursework and research on inclusive design, especially in areas of ethnic, gender, and cultural diversity. But after coming across a LinkedIn post by Nicki Washington, professor of the practice of computer science at Duke University, I felt empowered and inspired to facilitate a similar experience in my home Department of Human Centered Design & Engineering (HCDE) at the University of Washington via a directed research group (DRG). DRGs are selective, for-credit, research-oriented courses that expose students to a wide range of opportunities in practical research. With the support of faculty advisors, I led a DRG titled "Race, Culture, Gender & Ethics in Technology." This exploratory research experience offered students an opportunity to explore and analyze the historical and contemporary relationships of race, culture, gender, and ethics in digital media within the U.S.
|"Perpetuated Bias in Health Technologies" by Beth and Addie tackles questions of representation in health data and the creation of equitable training sets for medical professionals and AI.
Throughout the 10-week academic quarter, the group of 13 students and one professor examined the role of cultural, ethical, and social factors in shaping the norms, values, and meanings of scientific and technological practices, engaging in close analysis of the existing inclusive design literature and examining product and system design in the wild. Weekly topics included:
- Early Internet studies and foundations in critical race theory in digital media
- Current theories of race, racism, and sociotechnology
- Digital race and queer Internet studies
- Data feminism and power structures
- Systems of oppression in the wild
- Trust and transparency in technology
- Ethics, fairness, and inclusion in mixed reality (AR/VR)
The final deliverable for the DRG was to propose scholarly exploratory projects expressing, reflecting on, and analyzing our own perceptions of the intersectionality of race, culture, gender, and ethics in digital innovation. Inspired by scholar Catherine D'Ignazio, our group decided to construct a collaborative digital magazine, or zine, on the topic of society and inclusive technology. Split into seven pairs, each named after a different shade of purple, students were prompted to choose from overarching themes and address the corresponding research inquiries through critical analysis or critique of scholarly works. The themes were:
|"Technology Bias in ML, NLP, & AI" by Caitie and Anna outlines how ML, NLP, and AI systems reproduce biases, with disparities in representation in computing models being a key factor.
- Perpetuated bias in health technologies
- Racial inequity in the app-based gig-worker industry
- The cost of inclusive design
- Technology and ethics
- Technology bias in ML, NLP, and AI
- Addressing algorithmic bias and promoting inclusive design
- Inclusive technology and public policy
Throughout the experience, it was apparent that students valued this content knowledge. HCI researchers, technology developers, and designers bear a responsibility for how communities engage with, experience, and perceive the world around them through technology. Moreover, we are prompted to assess the practices, processes, and narratives through which technical systems have been designed, developed, and researched, evaluating whether they are multidimensionally inclusive and socially good for everyone.
|"Inclusive Technology & Public Policy" by Jay and Sarah outlines the opportunities that technologists and policymakers have to collaborate on creating sustainable policies that aim to protect people and prevent harmful tech practices.
|"Addressing Algorithmic Bias & Promoting Inclusive Design" by Alaina and Ashlyn establishes ethical and inclusive design guidelines for technical teams to consider.
|"Technology & Ethics" by Gabrielle and Trinity highlights that technology is never neutral due to its existence in social, cultural, and political contexts.
Jay L. Cunningham is a computational social scientist whose work explores the social and ethical implications of race, culture, identity, and power at the intersection of AI, NLP, and human-computer interaction. As a servant leader, he is passionate about advocating for diversity, equity, and inclusion among marginalized populations on campus and in his communities. [email protected]
Full zine available at jaylcunningham.com
Copyright held by author
The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.