Features

XXIII.5 September + October 2016
Page: 40

Sensing the future of HCI: Touch, taste, and smell user interfaces


Authors:
Marianna Obrist, Carlos Velasco, Chi Vi, Nimesha Ranasinghe, Ali Israr, Adrian Cheok, Charles Spence, Ponnampalam Gopalakrishnakone

The senses we call upon when interacting with technology are restricted. We mostly rely on vision and hearing, and increasingly touch, but taste and smell remain largely unused. Although our knowledge about sensory systems and devices has grown rapidly over the past few decades, there is still an unmet challenge in understanding people’s multisensory experiences in HCI. The goal is that by understanding the ways in which our senses process information and how they relate to one another, it will be possible to create richer experiences for human-technology interactions.


To meet this challenge, we need specific actions within the HCI community. First, we must determine which tactile, gustatory, and olfactory experiences we can design for, and how to meaningfully stimulate them in technology interactions. Second, we need to build on previous frameworks for multisensory design while also creating new ones. Third, we need to design interfaces that allow the stimulation of unexplored sensory inputs (e.g., digital smell), as well as interfaces that take into account the relationships between the senses (e.g., integration of taste and smell into flavor). Finally, it is vital to understand what limitations come into play when users need to monitor information from more than one sense simultaneously.


Thinking Beyond Audiovisual Interfaces

Though much development is needed, in recent years we have witnessed progress in multisensory experiences involving touch. It is key for HCI to leverage the full range of tactile sensations (vibrations, pressure, force, balance, heat, coolness/wetness, electric shocks, pain and itch, etc.), taking into account the active and passive modes of touch and its integration with the other senses. This will undoubtedly provide new tools for interactive experience design and will help to uncover the fine granularity of sensory stimulation and emotional responses.

Moreover, both psychologists and neuroscientists have advanced the field of multisensory perception over recent decades. For example, they have provided crucial insights on the multisensory interactions that give rise to the psychological “flavor sense” [1]. The development of taste and smell interfaces, and subsequently flavor interfaces, is still in its infancy; much work will be required to create multisensory-based systems that are both meaningful to people and scalable. Nevertheless, technology is advancing rapidly, including some one-off designs such as LOLLio [2], MetaCookie+ [3], and Tongue Mounted Digital Taste Interface/Taste+ [4] (Figure 1).

Taste+ is an example of how multisensory interaction could improve dining experiences (which, by definition, are multisensorial [1]). The user can augment the flavors of food and beverages by applying weak, controlled electrical pulses to the tongue via electronically enhanced everyday utensils such as spoons and beverage bottles. Initial experimental results show that users perceive virtual salty and sour sensations.
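To make the control problem concrete, here is a minimal Python sketch of how a Taste+-style utensil might be driven: a target sensation is mapped to pulse parameters and handed to a device-specific sender. The PulseSettings fields, the TASTE_PRESETS values, and send_command are illustrative assumptions, not the actual interface or calibration values reported in [4].

```python
# Hypothetical sketch of driving a Taste+-style utensil: map a target
# sensation to electrical-pulse parameters and hand them to a sender.
# Parameter names and values are illustrative, not taken from [4];
# real values would come from per-user calibration studies.

from dataclasses import dataclass

@dataclass
class PulseSettings:
    current_ua: int      # stimulation current in microamperes
    frequency_hz: int    # pulse frequency
    duration_ms: int     # how long to stimulate

TASTE_PRESETS = {
    "sour":  PulseSettings(current_ua=120, frequency_hz=200, duration_ms=500),
    "salty": PulseSettings(current_ua=60,  frequency_hz=50,  duration_ms=500),
}

def stimulate(taste: str, send_command) -> None:
    """Look up the preset for a taste and pass it to a device-specific sender."""
    settings = TASTE_PRESETS[taste]
    send_command(settings)

if __name__ == "__main__":
    # Print instead of talking to real hardware.
    stimulate("sour", send_command=print)
```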

Moving Toward the Chemical Senses

Here we want to highlight that there are opportunities to enhance designers’ and developers’ abilities to create meaningful interactions and make use of the whole spectrum of sensory experiences. However, there are still many challenges when studying taste and particularly smell, especially related to inter-subject variability, varying olfactory preferences over time, and cross-sensory influences. No other sensory modality makes as direct and intense contact with the neural substrates of emotion and memory, which may explain why smell-evoked memories are often emotionally potent.

Smell and taste are known as the chemical senses because they rely on chemical transduction. We do not yet know entirely how to digitize these senses in the HCI context compared with others like sound and light, where we can measure frequency ranges and convert them into a digital medium (bits).

As a community, we need to explore and develop design methods and frameworks that provide both quantitative and qualitative parameters for sensory stimulation. In the case of touch, this process is well supported by the proliferation of haptic technologies (from contact to contactless devices), but taste and smell are still at an early stage of development. However, the rich understanding achieved in psychology and neuroscience puts us ahead of the technology itself, giving us the opportunity to shape the development of future taste- and smell-based technologies (Figure 2) [3] and to establish a basic understanding of how these chemical senses can be characterized from an HCI design perspective.

For instance, Obrist et al. [5] investigated the characteristics of the five basic taste experiences (sweet, salty, bitter, sour, and umami) and suggested a design framework. This framework describes the characteristics of taste experiences across all five tastes, along three themes: temporality, affective reactions, and embodiment. Particularities of each individual taste are highlighted in order to elucidate the potential design qualities of single tastes (Figure 3). For example, sweet sensations can be used to stimulate and enhance positive experiences, though on a limited timescale, as the sweetness quickly disappears, leaving one unsatisfied. It’s a pleasant taste but one that is tinged with a bittersweet ending. In contrast to the sweet taste, the sour taste is described as short-lived, often coming as a surprise due to its explosive and punchy character. This taste overwhelms with its rapid appearance and rapid decay. It leaves one with the feeling that something is missing.
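One way to make such a framework actionable is to encode its three themes as data that design tools can query. The Python sketch below is only an illustration of that idea, with entries paraphrased from the descriptions above; the class name, field names, and structure are assumptions, not part of the published framework [5].

```python
# Illustrative encoding of the three themes from the taste-experience
# framework [5]. Entries paraphrase the descriptions in the text;
# the structure itself is an assumption for the sake of example.

from dataclasses import dataclass

@dataclass(frozen=True)
class TasteProfile:
    temporality: str         # how the experience unfolds over time
    affective_reaction: str  # typical emotional response
    embodiment: str          # the accompanying bodily/mouth feeling

TASTE_FRAMEWORK = {
    "sweet": TasteProfile(
        temporality="rapid onset that quickly fades",
        affective_reaction="pleasant, enhances positive experiences",
        embodiment="leaves one unsatisfied as the sweetness disappears"),
    "sour": TasteProfile(
        temporality="short-lived, explosive and punchy",
        affective_reaction="surprising, overwhelming",
        embodiment="rapid appearance and decay; something feels missing"),
    # salty, bitter, and umami would be filled in from the full framework [5].
}

# Example lookup a design tool might perform:
print(TASTE_FRAMEWORK["sour"].temporality)
```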

How Is This Information Useful for HCI?

LOLLio, the taste-based game device, currently uses sweet and sour for positive and negative stimulation during gameplay. We suggest that our framework could improve such games by providing fine-grained insight into the specific characteristics of taste experiences that could be integrated into the gameplay. For example, when a player moves between related levels of a game, a lingering taste such as bitter or salty is useful, given the specific characteristics of those tastes; when a player moves to a distinct level or takes on a side challenge, an explosive taste such as sour, sweet, or umami might be more suitable. The designer can adjust specific tastes in each category to create different affective reactions and a sense of agency.
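As a sketch of how this mapping could enter game code, the Python fragment below chooses a taste stimulus based on the event type. The event names, the prefer_pleasant flag, and the selection policy are hypothetical illustrations, not part of LOLLio [2] or the framework [5].

```python
# Minimal sketch: pick a taste stimulus for a game event, following the
# lingering-vs-explosive grouping described above. Event names and the
# prefer_pleasant flag are assumptions for illustration.

import random

LINGERING_TASTES = ["bitter", "salty"]          # continuity between related levels
EXPLOSIVE_TASTES = ["sour", "sweet", "umami"]   # distinct levels or side challenges

def choose_taste(event: str, prefer_pleasant: bool = True) -> str:
    """Return a taste for a game event, per the mapping sketched above."""
    if event == "related_level_transition":
        return random.choice(LINGERING_TASTES)
    if event in ("distinct_level_transition", "side_challenge"):
        # A designer might bias toward sweet when the event should feel rewarding.
        return "sweet" if prefer_pleasant else random.choice(EXPLOSIVE_TASTES)
    raise ValueError(f"unknown game event: {event}")

print(choose_taste("side_challenge", prefer_pleasant=False))
```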

There are already a number of suggestions from the context of multisensory product design. For example, Michael Haverkamp [6] has put forward a framework for synesthetic design. The idea here is to achieve “the optimal figuration of objects based upon the systematic connections between the senses.” For that purpose, Haverkamp suggests that designers need to take into account different levels of interconnections between the senses, such as the relations between (abstract) sensory features in different modalities (e.g., visual shape and taste qualities) or semantic associations (e.g., as a function of a common identity or meaning) that can for instance be exploited in a multimedia context (Figure 4).

Directions for Future Research

Multisensory experience research suggests a variety of directions for the future. For example, the research on taste experiences presented here can be discussed with respect to its relevance for design, building on existing psychological theories of information processing (e.g., rational and intuitive thinking). Dual-process theory, for instance, distinguishes two styles of processing in humans: System 1, which is intuitive and associative, fast and automatic, and strongly tied to emotion; and System 2, which is slower and more deliberate, shaped by conscious judgments and attitudes. On this account, the rapidity of the sour taste experience leaves too little time for System 1 to engage with it and instead triggers System 2 to reflect on what just happened. Such reactions, when carefully timed, could prime users to think more rationally during a productivity task (e.g., to rouse someone stuck in a loop). Moreover, an appropriately presented taste can create a synchronous experience that increases cognitive ease (supporting intuitive decisions) or reduces it (encouraging rational thinking). Note, of course, that taste inputs will generally be combined with other sensory inputs (e.g., visual), so the alignment or misalignment, that is the congruency, of the different inputs (in terms of processing style, emotion, identity, and so on) can lead to different outcomes, positive or negative.

Research of this kind could allow designers and developers to meaningfully harness touch, taste, and smell in HCI and open up new ways of talking about the sense of taste and related experiences. People often say things like “I like it. It is sweet,” but the underlying properties of specific and often complex experiences in HCI remain unarticulated and consequently inaccessible to designers. A framework that includes more fine-grained descriptions such as “it lingers” and “it is like being punched in the face,” which have specific experiential correlates, can therefore give designers a richer vocabulary and provoke interesting discussions around interaction design.

Furthermore, it is crucial to determine the meaningful design space for multisensory interactive experiences. For example, we rarely experience the sense of taste in isolation. Aiming for the psychological flavor sense may be one way forward, since in everyday life we combine taste, olfactory, and trigeminal/oral-somatosensory inputs whenever we eat or drink. Here, it is key to think about congruency and its ability to produce different reactions in the user. At the same time, it is equally important to understand the unique properties of each sensory modality before designing for their integration in interactive systems.

Studying these underexploited senses not only enhances the design space of multisensory HCI but also helps to improve the fundamental understanding of these senses along with their cross-sensory associations.

Acknowledgments

To learn more, please take a look at our successful CHI 2016 workshop (http://multi-sensory.info/chi2016workshop/) and our upcoming special issue in IJHCS (http://www.journals.elsevier.com/international-journal-of-human-computer-studies/call-for-papers/special-issue-on-multisensory-human-computer-interaction).

References

1. Spence, C. Multisensory flavor perception. Cell 161, 1 (2015), 24–35.

2. Murer, M., Aslan, I., and Tscheligi, M. LOLLio: Exploring taste as playful modality. Proc. of TEI 2013. 299–302.

3. Narumi, T., Nishizaka, S., Kajinami, T., Tanikawa, T., and Hirose, M. Augmented reality flavors: Gustatory display based on edible marker and cross-modal interaction. Proc. of CHI 2011. 93–102.

4. Ranasinghe, N., Karunanayaka, K., Cheok, A.D., Fernando, O.N.N., Nii, H., and Gopalakrishnakone, P. Digital taste and smell communication. Proc. of the 6th International Conference on Body Area Networks. ICST, 2011, 78–84.

5. Obrist, M., Comber, R., Subramanian, S., Piqueras-Fiszman, B., Velasco, C., and Spence, C. Temporal, affective, and embodied characteristics of taste experiences: A framework for design. Proc. of CHI 2014. 2853–2862.

6. Haverkamp, M. Synesthetic Design: Handbook for a Multi-sensory Approach. Birkhäuser Verlag, Basel, 2013.

7. Burgess, M. We got sprayed in the face by a 9D television. Wired (May 20, 2016); http://www.wired.co.uk/article/9d-television-touch-smell-taste

Authors

Marianna Obrist is a reader in interaction design at the University of Sussex, U.K., and head of the Sussex Computer Human Interaction (SCHI “Sky”) Lab (http://www.multisensory.info/). Her research focuses on the systematic exploration of touch, taste, and smell experiences as future interaction modalities. m.obrist@sussex.ac.uk

Carlos Velasco (http://carlosvelasco.co.uk/) is a member of the Crossmodal Research Laboratory, University of Oxford, U.K., and a postdoctoral research fellow at the Imagineering Institute, Iskandar, Malaysia. His research focuses on crossmodal perception and its applications. carlosvelasco@protonmail.com

Chi Thanh Vi is a postdoctoral research fellow at the SCHI Lab at the University of Sussex. He is interested in using different brain-sensing methods to understand the neural basis of user states, and the effect of taste on decision-making behavior. C.Vi@sussex.ac.uk

Nimesha Ranasinghe (http://nimesha.info) is a research fellow at the National University of Singapore. His research interests include digital multisensory interactions (taste and smell), wearable computing, and HCI. During his Ph.D. studies he invented virtual taste technology. nimesha82@gmail.com

Ali Israr is a senior research engineer at Disney Research, Pittsburgh, USA. He is exploring the role of haptics in multimodal and multisensory settings such as VR/AR, wearables, and handhelds, and in gaming. israr@disneyresearch.com

Adrian David Cheok (http://adriancheok.info) is director of the Imagineering Institute, Iskandar, Malaysia, and a chair professor of pervasive computing at City University London. His research focuses on mixed reality, HCI, wearable computers and ubiquitous computing, fuzzy systems, embedded systems, and power electronics. adrian@imagineeringinstitute.org

Charles Spence (http://www.psy.ox.ac.uk/team/charles-spence) is the head of the Crossmodal Research Laboratory, University of Oxford, U.K. His research focuses on how a better understanding of the human mind will lead to the better design of multisensory foods, products, interfaces, and environments. charles.spence@psy.ox.ac.uk

Ponnampalam Gopalakrishnakone is professor emeritus in anatomy at the Yong Loo Lin School of Medicine, National University of Singapore, and chairman of its Venom and Toxin Research Programme. gopalakrishnakone_pon@nuhs.edu.sg

Figures

Figure 1. Digital Taste Interface: A method for simulating the sensation of taste by actuating the human tongue through electrical and thermal stimulation [4].

Figure 2. AromaShooter, a smell-delivery device, contains six scent cartridges and connects to a computer via USB. (Developed by Aromajoin)

Figure 3. Three characteristics of taste experiences combined for each of the five basic tastes: temporality (the duration of the taste experience, read from left to right); affective reactions (green = pleasant, red = unpleasant, orange = neutral); and the embodied mouth feeling.

Figure 4. 9D TV: An example of multisensory integration while watching a movie, investigated by the SCHI “Sky” Lab team at the University of Sussex [7].

©2016 ACM  1072-5220/16/09  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

