Features

XXVIII.3 May - June 2021

A framework for evaluating social acceptability of spatial computing devices


Authors:
Arathi Sethumadhavan, Josh Lovejoy, David Mondello


Spatial computing devices are designed to integrate digital information more directly into a wearer's cognitive processes than traditional computing devices. They do this via techniques such as holographic projection, immersive spatial audio, haptic feedback, artificially intelligent assistants that are audible only to the wearer, and direct manipulation of synthetic content. While such information can enhance the wearer's environmental or contextual awareness, the use of these devices in social interactions can also reduce social presence for both device wearers and bystanders.


From the perspective of the wearer, digital content within their field of view and hearing has the potential to obscure conversation partners, which could make it difficult to respond to established nonverbal cues provided by their interlocutors. Additionally, augmented reality content can create inattentional blindness or even hijack attention [1]. From the perspective of bystanders, it can be difficult to assess a device wearer's conversational attention given the wearer's potentially simultaneous engagement with digital content [2].


In addition to complicating conversational cues, the presence of spatial computing devices in social interactions can also change the epistemic ground from which each social actor operates. Users of spatial computing devices have access to digital information about their environment and potentially about the people standing nearby. While some devices include public-facing indicators (e.g., recording LEDs), numerous examples exist of simple hardware modifications that render such notification methods ineffective [3]. The ability of device users to access digital information privately and covertly during social interactions with bystanders creates an asymmetric power dynamic. Therefore, the introduction of new technologies into society requires careful consideration of changes to existing social norms.

Social Norms

Social norms are the unwritten rules that govern what's considered appropriate behavior in groups. They are developed incrementally—and understood uniquely by each person—through a continuous process of experimenting with behavior and evaluating the reactions one receives. Technological innovation, meanwhile, is often disruptive by design and therefore almost always leaps ahead of existing social norms. It takes time for people to integrate a new device or capability into their habits, figure out what it is and isn't useful for, and calibrate their expectations about how their friends, colleagues, and others may perceive their use of it.


By considering how technology may alter expectations about social norms, designers can make intentional choices that proactively address mismatches between the conceptual models of users and bystanders. Those mismatches are most likely to surface along the dimensions of social acceptability explored in the next section.

In social contexts, technology alone cannot earn trust. Instead, trust must be designed as a relationship between people and measured by their ability to depend on one another to play by a shared set of rules, where technology doesn't cause them to inadvertently commit a foul. Therefore, we recommend paying particular attention to the four considerations for social norms shown in Table 1 when designing spatial computing devices.

Table 1. Considerations for social norms.

Social Acceptability

"Social acceptability is determined when the motivations to use technology compete with the restrictions of social setting" [4]. Some assessments of social acceptability have focused on the perspective of individuals in control of the technology [4,5]. Those subjected to the operation of the technologies, but lacking active control, have received less attention. More recent assessments of social acceptability, however, have elevated the role of bystanders [6,7]. But these evaluations have not adequately anticipated or addressed bystander concerns about privacy and security, resulting in widespread public pushback and limited adoption [2]. Such product outcomes underscore the importance of designing for bystanders and including this stakeholder group in evaluations of social acceptability.

An additional measure of how bystanders (i.e., those who are nearby) perceive the technology can help product teams design the product in ways that ease the discomfort of both bystanders and users. For example, bystanders could experience unease when subjected to ubiquitous sensing devices such as mixed reality headsets, companion drones, or meeting transcription tools. Likewise, in the presence of bystanders, end users could experience the weight of social stigma or even the glow of social desirability. By treating bystanders as first-class citizens and conducting evaluations of social acceptability throughout the development process, technology developers can begin to intentionally design solutions that respect existing social norms and preserve the agency of technology users and bystanders alike.

Key components of social acceptability. This section outlines the primary factors that shape social acceptability.

The factors described in Table 2 often interact in ways that affect how socially comfortable users feel when using a technology. For example, the social acceptance of virtual reality headsets among strangers (audience) could differ depending on whether the headsets were used in a public environment dedicated to virtual reality or on a plane (location). As such, it is important to consider how these factors work together in determining social acceptability.

Table 2. Social acceptability criteria.

The social acceptability metric. Previous evaluations have defined—and measured—social acceptability in a few different ways. Some highlight the tension between the desire to use technology and the restrictions of social settings, such as the audience, location, and gesture considerations outlined earlier. Others have measured the degradation of social interaction. Still others have explored the role of personal style and impression management.

Drawing on items from existing inventories of social acceptability measures, we propose a measure of social acceptability along three primary themes that preserve the shared set of rules that govern interpersonal exchanges between users and bystanders:

  • Prioritize interpersonal attention and focus. It must be obvious to bystanders when the user is attending to controls or digital content.
  • Equitable access to information. Anything the user can learn about bystanders must be available in both directions.
  • Respect existing customs. The technology must not override important rules governing social interactions in each location in the presence of an audience.

Table 3 presents these three themes along with the statements used to measure them. When conducting these evaluations, all items are rated by both technology users and bystanders on a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree). The average rating for each statement must be greater than 4.0 to pass the evaluation. In addition to providing Likert-scale ratings, users and bystanders also have an opportunity to supply open-ended responses that explain the rationale behind their ratings ("Please explain why you rated in this way"). This metric can be used in dyadic interviews (one user, one bystander), surveys, or citizen juries.

Table 3. Social acceptability themes and their measures.
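To make the scoring rule concrete, the following Python sketch shows one way a team might compute it from collected ratings. It is an illustration rather than the authors' tooling: the statement texts and ratings are placeholders, and it assumes the 4.0 threshold applies to users and bystanders separately, which is one reasonable reading of the rule above.

# Minimal sketch of the pass/fail rule for the proposed metric, under the
# assumption that each statement must average above 4.0 for users AND for
# bystanders. Statements and ratings are hypothetical placeholders, not
# items from Table 3.

from statistics import mean

PASS_THRESHOLD = 4.0  # average rating a statement must exceed to pass

# Ratings collected per statement, split by stakeholder group.
ratings = {
    "It was obvious when my conversation partner was attending to the device": {
        "users": [5, 4, 5, 4],
        "bystanders": [4, 5, 5, 5],
    },
    "I had access to the same information about others as they had about me": {
        "users": [5, 5, 4, 5],
        "bystanders": [3, 4, 4, 3],
    },
}

def evaluate(ratings_by_statement, threshold=PASS_THRESHOLD):
    """Return per-statement group averages and whether every group exceeds the threshold."""
    results = {}
    for statement, groups in ratings_by_statement.items():
        averages = {group: mean(scores) for group, scores in groups.items()}
        results[statement] = {
            "averages": averages,
            "passes": all(avg > threshold for avg in averages.values()),
        }
    return results

for statement, result in evaluate(ratings).items():
    status = "PASS" if result["passes"] else "REVIEW"
    print(f"{status}: {statement} -> {result['averages']}")

Requiring both groups to clear the threshold, rather than pooling their ratings, keeps a strongly positive user group from masking bystander discomfort, which is consistent with treating bystanders as first-class stakeholders.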

Finally, in Table 4 we summarize some best practices for evaluating social acceptability.

Table 4. Best practices for measuring social acceptability.

Conclusion

The metric proposed in this article considers the social conditions that shape acceptable use alongside the technical details that underpin device functionality. In doing so, it seeks to address the critical factors that constitute spatial computing wearables as sociotechnical systems for both device wearers and bystanders, and to reveal opportunities for promoting social behavior while also respecting social norms.

References

1. Slater, M. et al. The ethics of realism in virtual and augmented reality. Frontiers in Virtual Reality 1 (2020).

2. Due, B.L. The social construction of a Glasshole: Google Glass and multiactivity in social interaction. PsychNology J. 13 (2015), 149–178.

3. Koelle, M., Wolf, K., and Boll, S.C. Beyond LED status lights: Design requirements of privacy notices for body-worn cameras. Proc. of the 12th International Conference on Tangible, Embedded, and Embodied Interaction. ACM, New York, 2018, 177–187; https://doi.org/10.1145/3173225.3173234

4. Rico, J. and Brewster, S. Gestures all around us: User differences in social acceptability perceptions of gesture based interfaces. Proc. of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services. ACM, New York, 2009, Article 64; https://doi.org/10.1145/1613858.1613936

5. Ahlström, D., Hasan, K., and Irani, P. Are you comfortable doing that? Acceptance studies of around-device gestures in and for public settings. Proc. of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, 2014, 193–202; https://doi.org/10.1145/2628363.2628381

6. Koelle, M., Ananthanarayan, S., and Boll, S.C. Social acceptability in HCI: A survey of methods, measures, and design strategies. Proc. of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2020, 1–19; https://doi.org/10.1145/3313831.3376162

7. Miller, M.R., Jun, H., Herrera, F., Yu Villa, J., Welch, G., and Bailenson, J.N. Social interaction in augmented reality. PLOS ONE 14, 5 (2019), e0216290; https://doi.org/10.1371/journal.pone.0216290

8. Harms, C. and Biocca, F. Internal consistency and reliability of the networked minds social presence measure. In Seventh Annual International Workshop: Presence 2004. M. Alcaniz and B. Rey, eds. Universidad Politecnica de Valencia, Valencia, 2004.

9. Kelly, N. and Gilbert, S. The WEAR scale: Developing a measure of the social acceptability of a wearable device. Proc. of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. ACM, New York, 2016, 2864–2871; https://doi.org/10.1145/2851581.2892331

10. Reeves, S., Benford, S., O'Malley, C., and Fraser, M. Designing the spectator experience. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 2005, 741–750; https://doi.org/10.1145/1054972.1055074

Authors

Arathi Sethumadhavan leads User Research for Ethics & Society at Microsoft, where her team brings the perspectives of a diverse set of stakeholders to help shape products. She is also a Fellow at the World Economic Forum, where she is working on unlocking opportunities for positive impact with AI. arathi.sethumadhavan@microsoft.com

Josh Lovejoy leads Design for Ethics & Society at Microsoft, where his team works at the intersection of product development, user experience, AI, law, and philosophy. Prior to Microsoft, he was UX lead for People+AI Research at Google and founded Amazon's unified design system for online-shopping experiences. josh.lovejoy@microsoft.com

David Mondello is a design researcher at Microsoft, focused on eliciting the voices of traditionally overlooked stakeholders to help inform the development of emerging technologies. His recent work has included preserving the value of professional expertise in human-AI collaboration. He is currently pursuing an M.S. in the Department of Human-Centered Design & Engineering at the University of Washington. david.mondello@microsoft.com


©2021 ACM  1072-5520/21/05  $15.00

