Nithya Sambasivan, Garen Checkley, Nova Ahmed, Amna Batool
Not long ago, Nutan, a high school graduate in rural Uttar Pradesh, India, purchased a smartphone with her savings. She used the phone to watch how-to videos on eyebrow threading, facials, and hairstyles, and music videos by her favorite star, Ranbir Kapoor. Nutan now owns a beauty salon, building her business on the knowledge she gained from the videos. Her salon offers a nurturing environment on special occasions, such as weddings and birthdays, for women who otherwise are not allowed to leave the village without a man’s permission.
In another village in Uttar Pradesh, a khap panchayat (a kangaroo court with social influence but no legal standing) recently stipulated a $325 fine for women speaking on mobile phones in public. Phones are routinely banned in public because they supposedly lead to “affairs and elopements”—suspicions cast mainly on women and non-binary people. Such diktats are common in villages with traditional outlooks, leading to the tight regulation and control of women’s access to technology [1].
These realities are two sides of the same coin. As technology access improves around the world—either directly or through mediation—traditional social, cultural, religious, and safety norms influence access to the technology. Women and non-binary people are some of the most affected by these dynamics.
Women constitute only 29 percent of Internet users in India, despite making up nearly half the population [2]. Throughout emerging economies, women are less likely than men to own a mobile phone or go online [3]. Lower literacy, lower purchasing power, lower technical literacy, limited physical mobility, and prevailing discriminatory practices prevent many women from accessing smartphones.
We tend to rely on quantitative and qualitative data to make design decisions. We might already be sampling equal gender ratios in qualitative research. But unless we create an intentional space to discuss issues relevant to women and non-binary people, we are not listening to their voices in our research. Similarly, with only a fraction of quantitative data coming from women, owing to their low representation and limited access to the Internet and devices, the aggregate doesn’t represent the people at the margins. It’s easy for women’s voices to get washed out when drawing conclusions from total numbers.
A poor gender ratio online is detrimental to the Internet as a whole. Effects include access and identity assumptions based on male users; unwelcome and sometimes unsafe experiences for women and non-binary people; biased machine-learning training data; and self-perpetuating cycles of poor adoption among women and non-binary people. The HCI community can play an important role in designing experiences that are inclusive of underrepresented women’s needs, rather than one-size-fits-all models.
Here we share our reflections on creating an inclusive design process, based on our research on gender and technology in India, Pakistan, Bangladesh, and Nigeria. We are currently expanding our research to Mexico and Brazil. We have interviewed nearly 200 women, including new mothers, rural farm workers, call-center workers, CS Ph.D. students, and bankers. We suggest four considerations to help designers and researchers navigate the contours of gender inclusion (Figure 1).
Figure 1. A framework for gender equity in design.
Start by asking whether women and non-binary folks can meaningfully access your technology. Access may be gated by several sociotechnical factors. First, women make up two-thirds of the world’s non-literate population. If you make literacy and fluency assumptions about your users, you risk excluding many women.
Second, women tend to own and use devices with lower-end specifications due to phone sharing, the gifting of older phones, and lower purchasing power.
Garment workers in Bangladesh.
Third, women’s physical mobility may be limited. It might be less socially acceptable for a rural working woman to travel alone. An urban working woman might have more freedom, but safety concerns may reduce her mobility. Limited physical mobility affects core assumptions like the ability to access public WiFi or use transportation services or maps.
Fourth, prevailing gender norms may translate to the monitoring of women’s technology use by family members. In Bangladesh, we met women who told us their husbands install location-tracking and spyware tools on their phones to monitor their calling and movement activity. Therefore, these women borrow each other’s phones, or their employer’s, for calls or private browsing.
Finally, women’s free time may be disproportionately limited due to employment obligations and housework. For example, in Turkey, women spend more than four hours per day on housework, while men spend less than one-tenth of that amount on chores [4]. Limited time affects how women view and access technology.
Can women meaningfully access and use the technology product, platform, or service? Ask yourself what bare-minimum specifications and capabilities are needed to meaningfully use your technology. If you are building a technology that requires physical mobility of any form, question how you can enable those with limited mobility to make use of it. Finally, do not assume the intention or the ability to always be online; design your experience so people can derive value in a short amount of time. And don’t ignore men in your design—they are often in positions of power, acting as gatekeepers to women’s access.
Women often face heavy stigma and discrimination when asking people for information on intimate and household topics. Many stages of life are health- and body-related, and therefore intimate (e.g., motherhood, wellness, first period). Other stages of life and concerns may involve highly sensitive topics such as identity, sexuality, violence, finances, and illness. Accurate information is hard to come by and may be considered taboo to discuss. Content platforms can open a conversation on highly stigmatized topics in safer, locally relevant ways.
Women may also use technology to discover communities of similar women, such as those who are lesbian or survivors of domestic violence. But if this content exists, there is often a mismatch between how platforms and users semantically describe it, making it difficult for users to find. For instance, a system might categorize as “domestic violence” content that a potential user might refer to as “beating.”
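One lightweight way to bridge this vocabulary gap is to expand or normalize user queries against community-sourced synonyms before matching them to platform categories. A minimal sketch, in which the terms and the mapping are illustrative examples rather than data from our research:

```python
# Illustrative sketch: map colloquial terms users actually search for
# to a platform's canonical content categories. The entries below are
# hypothetical; in practice they would come from query-log analysis
# and community research in the local language.
SYNONYMS = {
    "beating": "domestic violence",
    "husband beats me": "domestic violence",
}

def canonicalize(query: str) -> str:
    """Return the platform's category for a colloquial query, or the
    query unchanged if no mapping is known."""
    return SYNONYMS.get(query.strip().lower(), query)
```

A real system would work at the level of phrases and embeddings rather than exact strings, but the design point is the same: the burden of translation should sit with the platform, not the user.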
Many women have role models from within their social circles who share their family values, but there are few relatable role models who demonstrate upward mobility and career aspirations. It is much harder for women in non-traditional roles, or who break sociocultural norms by, for instance, pursuing higher education, to find relatable role models. Here a career role model with a similar language, culture, or lifestyle plays an important role, not just for the women but also for their families. For example, a video about a female Bangladeshi academic may do more than one about a Western academic to communicate career paths, marriage choices, and women’s success to the mother of a Dhaka college student.
Community platforms often envision user journeys that end with full participation or user-generated content creation. Women may hesitate to fully participate due to social consequences and negative-feedback systems such as stalking, hate, or judgment. Our research shows that many women participants hesitate to post questions or comments, or to create posts. Women often prefer to share anonymous or lightly filled-out profiles instead.
If you are building a content or community platform, feature locally aspirational role models with regard to gender roles, class, languages, life experiences, and sexuality, and make them discoverable. Platforms can further integrate with support communities, such as NGO groups, local support groups, or knowledge-bearers, especially for intimate topics like health and abuse. Platforms should encourage women to participate safely (e.g., provide good abuse-handling and moderation for comments) and comfortably (e.g., make other female users on the platform visible).
Women’s phones tend to get shared more frequently than male family members’ phones in some contexts. Sharing happens willingly (e.g., with children), unwillingly (e.g., a husband checks his wife’s messages), and inevitably (e.g., over-the-shoulder browsing in small spaces). Consider Gauri, a student in north India, whose brother regularly checks her phone to find out if she has any boyfriends. Gauri sometimes refuses, but mostly she yields to the social expectation of handing over her phone. Sometimes she uses clever privacy strategies, such as deleting social media images and search queries and using an applock (an app that lets users hide content, communications, or other apps).
Offer privacy controls that are discoverable and granular. In contrast to heavy-handed controls such as locks and profiles, micro controls, such as the ability to delete individual search queries, viewed content, or recommendations, give people flexibility and freedom in what they reveal. Macro private states such as incognito modes are always useful, but they need to be made readily discoverable. Explain when and why information is collected in easy, transparent language, rather than in undiscoverable or jargon-heavy terms of service or privacy policies.
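The micro/macro distinction can be sketched in a simple data model. The class and method names below are ours, for illustration only, and do not describe any particular product:

```python
from dataclasses import dataclass, field

@dataclass
class BrowsingHistory:
    """Sketch of granular privacy controls: users can delete individual
    items (micro control) or record nothing at all (macro control),
    instead of relying only on heavy-handed locks and profiles."""
    incognito: bool = False  # macro control: an incognito-style mode

    _queries: list = field(default_factory=list)

    def record(self, query: str) -> None:
        # Incognito mode leaves no trace for a shared-phone user to find.
        if not self.incognito:
            self._queries.append(query)

    def delete_query(self, query: str) -> None:
        # Micro control: remove one sensitive item, keep the rest.
        self._queries = [q for q in self._queries if q != query]

    def queries(self) -> list:
        return list(self._queries)
```

The design choice the sketch encodes is that deletion is per-item and cheap; a user under observation can curate what her history shows without the conspicuous act of wiping everything.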
Women’s online activities often have consequences for offline reputation. For example, the fear of profile pictures getting morphed into pornography is ultimately about the resulting reputation damage. In 2016, a 21-year-old committed suicide in Madurai, India, the result of her social media profile picture getting superimposed over a nude and the social consequences she suffered [5]. Incidents like these travel virally through news, social media, and word-of-mouth, accompanied by victim blaming and blanket fears of self-expression. As a result, many women view social media platforms as unsafe, choosing images of flowers and landscapes as profile pictures.
Women frequently receive friend requests, chats, and phone calls from male strangers. Consequently, they are forced to learn the privacy settings of apps, often only after negative incidents. Further, many women we spoke to hesitate to register their phone numbers during account creation or to declare their gender in applications.
Abuse is often not reported online; instead, it is hidden or dealt with through family or friends. Also, abuse-reporting terms are often undiscoverable and use unfamiliar terminology. For instance, the term “blocking” was more familiar to our participants than “report abuse.” Weak law enforcement and distrust of authority are other barriers to reporting abuse online.
Physical safety was another major concern for participants across our study. Applications that involve physical contact, such as transportation, navigation, or home-delivery services, should consider the rigorous vetting of service agents, such as drivers, couriers, or repair technicians. Relevant applications should provide and improve the discovery of location tracking, abuse reporting, and emergency help.
Ensure that abuse and content reporting are discoverable and easy to use: focus on wording and context. When users report abuse, they should see meaningful follow-up and a change in their experience. We encourage product designers to advocate for legal rights around digital abuse.
Regardless of whether you are a researcher, designer, product manager, or community manager, everyone can critically examine their work to understand how they can advance gender equality. Below we suggest steps for advocating and advancing change within your institution.
First, we recommend familiarizing yourself or your institution with the issues. Understand your female and non-binary users—or potential users—and their contexts, needs, and communities. Conducting gender-focused research, or simply analyzing existing research results along gender lines, can offer a glimpse into the challenges and opportunities your team or research group might address.
Second, we recommend creating a business case or institutional initiative around a specific challenge or goal. While many of us consider gender-inclusive design a moral obligation, this argument is hard to weigh against other organizational interests or grant applications competing for funding. Additionally, broad social goals can feel unachievable when compared with focused initiatives. Estimating the impact on female user engagement or research innovation will allow you to frame the narrative as core to the value of your organization or funding body, and might even enable the creation of a team to focus on these initiatives full time.
Third, track progress with metrics. Metrics may be either qualitative (e.g., product perception or discovery of a female-friendly feature) or quantitative (e.g., the ratio of female to male users, female versus male engagement, or market gaps). Cultivating a focus on metrics and goals can help brand an initiative as rigorous, thoughtful, and core to your organization’s value.
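The quantitative metrics above are simple to define precisely, which helps when setting goals. A sketch, with metric definitions that are our own illustrative choices rather than standard measures:

```python
def gender_ratio(female_users: int, male_users: int) -> float:
    """Female-to-male user ratio; 1.0 means parity.
    (Illustrative definition; a product team might instead track
    the female share of all users.)"""
    return female_users / male_users

def engagement_gap(female_sessions_per_user: float,
                   male_sessions_per_user: float) -> float:
    """Relative engagement gap; 0.0 means parity, and positive values
    mean women engage less than men on average."""
    return 1.0 - female_sessions_per_user / male_sessions_per_user
```

For example, a product with 290 female and 710 male users has a gender ratio of about 0.41; tracking that number quarter over quarter turns a broad social goal into a measurable one.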
Lastly, we encourage conversation about your initiative, both within and outside your organization. If all students, employees, press, and companies participated in public discourse on gender inclusivity, we would make faster progress as a society.
We thank Sane Gaytan, David Nemer, Lauren Johnson, Cheng Wang, Taylor Marable, Lamar Alayoubi, Asif Baki, Tracey Chan, Pratap Kalenahalli Sundaresan, Adriana Olmos, and Swati Trehan.
1. AFP. Rs. 21,000 fine for women seen using mobile phone. NDTV. May 3, 2017; https://www.ndtv.com/india-news/rs-21000-fine-for-women-seen-using-mobile-phone-says-up-village-1689094
2. Jain, M. India’s internet population is exploding but women are not logging in. Scroll.in. Sept. 26, 2016; https://scroll.in/article/816892/indias-internet-population-is-exploding-but-women-are-not-logging-inia
3. GSMA. Bridging the gender gap: Mobile access and usage in low- and middle-income countries. 2015; https://www.gsma.com/mobilefordevelopment/wp-content/uploads/2016/02/GSM0001_03232015_GSMAReport_NEWGRAYS-Web.pdf
4. Chalabi, M. Internationally, women still spend more time doing chores. FiveThirtyEight. Mar. 7, 2014; https://fivethirtyeight.com/features/internationally-women-still-spend-more-time-doing-chores/
5. Saravanan, A.P. Salem woman’s suicide case. The Hindu. June 29, 2016; http://www.thehindu.com/news/national/tamil-nadu/Salem-womans-suicide-case-Youth-arrested-for-morphing-photos-on-Facebook/article14408398.ece
Nithya Sambasivan leads research on foundational topics with underrepresented communities in the Next Billion Users team at Google. She received a Ph.D. in informatics from the University of California, Irvine, where her dissertation focused on slum communities, sex workers, and microenterprises. email@example.com
Garen Checkley evangelizes emerging-market best practices to teams across Google. He previously led the design of YouTube Go, a new app designed and built specifically for the next billion users. firstname.lastname@example.org
Nova Ahmed is a faculty member in the Department of Electrical and Computer Engineering at North South University, Bangladesh. She received her M.S. from Georgia State University and her Ph.D. from Georgia Tech. Her research interests include HCI and distributed computing, with a focus on gender and international development. email@example.com
Amna Batool is a teaching fellow in the Department of Computer Science at ITU and director of the Innovations for Poverty Alleviation Lab (IPAL). She leads research on Har Zindagi (Every Life Matters), a project focused on improving children’s immunization coverage in Pakistan. Her interests include maternal and child health, and health-services delivery. firstname.lastname@example.org
Copyright held by author/owner
The Digital Library is published by the Association for Computing Machinery. Copyright © 2018 ACM, Inc.