Features

XXIV.6 November + December 2017

The how and why behind a multisensory art display


Authors:
Damien Ablart, Carlos Velasco, Chi Vi, Elia Gatti, Marianna Obrist


Designing multisensory experiences has long fascinated artists and scientists alike. In recent years, there has been growing interest in multisensory experience design within the HCI community [1]. Alongside advances in haptic technologies, we see novel work on olfactory and gustatory systems [2,3] and efforts to map out multisensory design spaces [4]. Moreover, artists, museum curators, and the creative industries are interested in these emerging technologies for their own work. Here we present Tate Sensorium, a multisensory art display, as a case study in multisensory design.


Tate Sensorium was the winning project of the 2015 Tate Britain IK Prize award (http://www.tate.org.uk/about/projects/ik-prize). The aim of this project was to design an art experience that involved all the traditional five human senses. To achieve this goal, a cross-disciplinary collaboration between industry, sensory designers, and researchers was formed (see list of partners in the Acknowledgments). Flying Object (http://www.weareflyingobject.com/), a creative studio in London, led the project. We, the team from the University of Sussex Computer-Human Interaction Lab (SCHI; pronounced “sky”), advised on the design of the multisensory experiences, including new tactile sensations through a novel mid-air haptic technology [5], and on the evaluation of the visitors’ experiences. Here, we focus on our contribution to the design process (the why and how); details on the evaluation and findings can be found in [6].


Selection of the Artwork

All the stakeholders of Tate Sensorium contributed to the artwork-selection process, which was supervised by Flying Object. At the beginning, all kinds of art pieces were considered, including paintings and sculptures. However, due to practical constraints (i.e., artwork availability and exhibition space), only paintings were considered for the multisensory art display. The selection was later narrowed further to abstract works, which leave room for interpretation that can be “colored” with multisensory content (i.e., auditory, tactile, olfactory, and gustatory stimuli).

The first set of selections included 60 abstract paintings. The final decision was made in June 2015. Four paintings were chosen based on their overall potential for multisensory complementarity and their availability for the duration of the exhibition. Due to copyright issues, images of the actual paintings cannot be included in this article, though we illustrate them as they appeared at Tate Sensorium (Figure 1).

Figure 1. The four paintings and their multisensory components included in Tate Sensorium (from left to right): Interior II by Richard Hamilton, Full Stop by John Latham, In the Hold by David Bomberg, and Figure in a Landscape by Francis Bacon.

Multisensory Design Process

The design of the different multisensory experiences for the four paintings followed four main steps.

Step 1: The project team generated ideas for each of the four paintings in a series of workshops and iterative sessions. The team followed principles of rapid prototyping, testing scent samples, combinations of food ingredients, and variations of audio and mid-air haptic patterns [6].

Step 2: In order to avoid sensory overload, the team assigned a leading sense (and corresponding sensory stimulus) alongside a secondary sense to each painting. The aim was to find the best balance between the visual characteristics of the painting and the additional sensory inputs.

Step 3: After the project team agreed on steps 1 and 2, sub-teams were formed to collaborate on the design of specific multisensory experiences for each painting under the overall supervision of Flying Object.

Step 4: During the final month before the exhibition opened, the different multisensory designs were iteratively refined. They were deployed and pilot tested at the dedicated space in the art gallery within the last two weeks before the opening.

Below we describe the final multisensory arrangements for each of the four paintings (see overview in Table 1).

Table 1. Selected paintings and their associated sensory stimuli (the leading sense is in red).

#1 Interior II: The experience designed for this Richard Hamilton painting integrated scents and sounds (Figure 1a). The sounds were presented using four speakers, one in each corner of the room, to create quadraphonic (four-channel surround) sound. Scents were delivered using three Olfactive Spirit Pro perfume diffusers (http://www.signatureolfactive.com/), placed along the side walls of the room. Each diffuser delivered a specific scent: (1) the scent of the late 1940s (spicy carnation fragrances), matching the look of the woman in the painting; (2) a solvent and glue aroma relating to the materiality of the work shown at the back of the room in the picture; and (3) the scent of cleaning products, relating to the construction of the interior and its parquet surfaces.

#2 Full Stop: Visitors experienced this painting together with sound and haptic stimuli (Figure 1b). For the latter, we used mid-air haptic stimuli, which create tactile sensations on the user’s hand (i.e., the palm) without physical contact, such as the feeling of dry rain or of air blown through a straw onto the skin [5]. The soundscape, created by the sound designer, emphasized the interplay between the positive and negative space of the artwork, especially the painting’s duality of black and white. This was further highlighted by and synchronized with the tactile sensation, designed as a combination of a circle-shaped pattern that changed in size, mirroring the roundness of the painting, and a rain-like pattern referencing artist John Latham’s use of spray paint.
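
To give a sense of how such mid-air tactile patterns are typically driven, the sketch below generates timestamped focal-point trajectories for a circle pattern and a rain-like pattern. It is illustrative only and not the code used at Tate Sensorium; the update rate, palm height, pattern parameters, and the idea of streaming plain (time, x, y, z) samples to the device are all assumptions.

```python
import math
import random

# Illustrative sketch only -- not the Tate Sensorium implementation.
# Mid-air haptic devices typically render sensations by moving one or more
# focused ultrasound points across the palm; here we just compute the
# timestamped focal-point positions for two patterns.

UPDATE_RATE_HZ = 200     # assumed trajectory update rate
PALM_HEIGHT_M = 0.20     # assumed height of the palm above the array (meters)

def circle_pattern(duration_s, base_radius_m=0.02, pulse_hz=0.5):
    """Circular path whose radius slowly grows and shrinks,
    echoing the round form of the painting."""
    points = []
    for i in range(int(duration_s * UPDATE_RATE_HZ)):
        t = i / UPDATE_RATE_HZ
        radius = base_radius_m * (1.0 + 0.5 * math.sin(2 * math.pi * pulse_hz * t))
        angle = 2 * math.pi * 2.0 * t  # two revolutions per second
        points.append((t, radius * math.cos(angle), radius * math.sin(angle), PALM_HEIGHT_M))
    return points

def rain_pattern(duration_s, drops_per_s=8, palm_radius_m=0.04):
    """Short taps at random positions on the palm, evoking drops of rain."""
    points = []
    for k in range(int(duration_s * drops_per_s)):
        t = k / drops_per_s
        r = palm_radius_m * math.sqrt(random.random())  # uniform over the palm disc
        a = 2 * math.pi * random.random()
        points.append((t, r * math.cos(a), r * math.sin(a), PALM_HEIGHT_M))
    return points

# A one-minute sequence alternating the two patterns; in a real installation
# these samples would be streamed to the haptic device and started in sync
# with the soundscape.
sequence = circle_pattern(30.0) + [(t + 30.0, x, y, z) for (t, x, y, z) in rain_pattern(30.0)]
```

Under this framing, synchronizing the tactile sensation with the soundscape reduces to starting both streams from a shared clock.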

#3 In the Hold: This painting was experienced together with scents and sounds (Figure 1c). The sound was presented using four directional speakers. The auditory stimuli were designed to draw the listener toward the painting through two planes of sound. One plane addressed the geometry of the painting (David Bomberg’s quest for “pure form”), with acute angles and jagged sounds. The second plane explored the subject matter of the “hold.” The scent stimuli had a similar function, with two aromas integrated into two 3D-printed objects: one scent aimed to be shrill, to bring out the blue in the painting, while the other was a diesel and tobacco blend. Both scents were presented at low concentrations and were paired with the sounds.

#4 Figure in a Landscape: Visitors experienced Francis Bacon’s painting with taste, scent, and sound stimuli (Figure 1d). The sound was presented to the visitors via headphones. The taste stimulus was delivered in the form of a chocolate praline on a plinth, set in a bed of tiny chocolate bits that evoked soil. The taste conveyed the painting’s dark, harsh nature and the wartime era through multiple ingredients, namely charcoal, sea salt, cacao nibs, and smoky Lapsang souchong tea. It also referenced the London park setting and the painting’s flashes of color with burnt orange. The accompanying scent aimed to convey a sense of Hyde Park’s smellscape: grass, soil, and earth, but also a horsey scent. The auditory stimuli referenced the color palette, visual texture, and place depicted in the painting.

Multisensory Experience in the Art Gallery

After a six-month preparation process, Tate Sensorium was set up in a dedicated space within the Tate Britain art gallery. Figure 2 shows an overview of the space, which was divided into four cabinets, one for each of the four paintings. The exhibition was designed so that four visitors could experience it at a time. Altogether, it lasted about 15 minutes, with three minutes allocated to each of the four paintings. After visitors entered through the main door, a staffer welcomed them; they were then guided by instructions given either through headphones (rooms 1, 3a, and 4) or through speakers in the dedicated areas (rooms 2 and 3b).

At the entrance (marked 1 in Figure 2), visitors were instructed to put on headphones to listen to a short introduction to Tate Sensorium, which briefly explained the intent behind the multisensory creation and how visitors might find their own interpretation of each artwork. Visitors also received a wristband to capture their skin conductance response, which was used to create a personalized printout at the end of the tour (this was supervised by Flying Object).
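
The article does not detail how the wristband data were turned into the printout; purely as an illustration, a per-painting summary of the skin conductance trace could be computed along the lines below. The sampling rate, the data format, and the segment boundaries are all assumptions.

```python
# Illustrative sketch only -- the wristband's sampling rate, data format,
# and the room-transition times below are assumptions, not details of the
# actual Tate Sensorium system.

SAMPLE_RATE_HZ = 4  # assumed skin conductance sampling rate

def summarize_segments(trace_microsiemens, segments):
    """Return mean and peak skin conductance for each named segment.

    trace_microsiemens: list of samples taken at SAMPLE_RATE_HZ
    segments: list of (label, start_s, end_s) tuples, one per room
    """
    summary = {}
    for label, start_s, end_s in segments:
        lo, hi = int(start_s * SAMPLE_RATE_HZ), int(end_s * SAMPLE_RATE_HZ)
        window = trace_microsiemens[lo:hi]
        if window:
            summary[label] = {"mean": sum(window) / len(window), "peak": max(window)}
    return summary

# Placeholder 15-minute trace and room times, just to show the call.
trace = [2.0 + 0.001 * i for i in range(15 * 60 * SAMPLE_RATE_HZ)]
segments = [("Interior II", 60, 240), ("Full Stop", 270, 450),
            ("In the Hold", 480, 660), ("Figure in a Landscape", 690, 870)]
print(summarize_segments(trace, segments))
```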

After the introduction, visitors removed their headphones and continued walking to the room marked 2 in Figure 2. There, they experienced Interior II by Richard Hamilton. Visitors were instructed (through the speakers in the room) to experience the painting as naturally as possible and to move around the space to explore the three different scents. After that, staffers asked the visitors to separate into pairs (Pair 1 and Pair 2) and continue to the next painting (either 3a or 3b in Figure 2).

Figure 2. Room layout of the Tate Sensorium exhibition in the Tate Britain art gallery in London.

Pair 1 went to the room marked 3a in Figure 2 and viewed Full Stop by John Latham. Following the audio guidance, one of them placed her hand into the empty space in the plinth to experience the mid-air haptic feedback integrated and synchronized with the sound, while the other visitor listened to the sound alone as they viewed the painting. The complete synchronized sound-haptic experience lasted for one minute; visitors were then instructed through the headphones to swap positions so that the second person could also experience the full sound-haptic integration.

Pair 2 went to the room marked 3b in Figure 2 and experienced David Bomberg’s In the Hold together with the synchronized sound and scent stimuli. Visitors were instructed to walk around and pick up one of the two 3D-printed scent objects to enjoy the combined sound-scent composition. Then they changed positions and enjoyed each object and its distinct scent aligned with the specific auditory stimuli (as described above).

After Pair 1 finished room 3a and Pair 2 finished room 3b, they switched: Pair 1 moved on to room 3b and Pair 2 moved to room 3a, following the same procedure described above for each of the two paintings.

All four visitors then moved to the final room, marked 4 in Figure 2. Each of them put on the headphones, from which they received the instructions for the final painting. They all stood in front of Francis Bacon’s Figure in a Landscape painting with a plinth in front of them that held four chocolate pralines. Visitors were instructed through headphones to pick up a piece of chocolate and eat it while experiencing the synchronized scent and sound stimuli.

Lessons and Benefits for HCI Research

Tate Sensorium was open to the public between August 25 and October 4, 2015. Within this timeframe, 4,000 visitors experienced the selected art pieces in a new and innovative way. We collected feedback from 2,500 visitors through questionnaires and conducted 50 interviews to capture the subjective experiences of gallery visitors. Around 87 percent of visitors rated the experience as very interesting (at least 4 on a 5-point Likert scale), and around 85 percent expressed an interest in returning to the art gallery for such multisensory experiences (see details on the findings in [6]).
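
As a small illustration of how such headline figures are derived (the ratings below are placeholder values, not the study’s data), the share of visitors rating the experience at 4 or above on the 5-point scale can be computed as follows.

```python
# Illustrative sketch only -- placeholder ratings, not the questionnaire data.

def share_at_least(ratings, threshold=4):
    """Fraction of Likert ratings at or above the threshold."""
    return sum(1 for r in ratings if r >= threshold) / len(ratings)

ratings = [5, 4, 4, 3, 5, 5, 4, 2, 5, 4]  # hypothetical 5-point responses
print(f"{share_at_least(ratings):.0%} rated the experience 4 or higher")
```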

Here, we reflect on our experience in the Tate Sensorium project. The degree of success of the venture depends on whom you ask. From the gallery’s point of view, the results of Tate Sensorium exceeded initial expectations: The originally planned one-month exhibition period was extended by two additional weeks, given the strong public interest. From the creative team’s point of view, it was also a success, despite minor technical challenges in integrating the different sensory stimuli and coordinating a very diverse project team.

From an HCI research point of view, this project provided us with a unique opportunity to test and deploy an emerging technology in a real-world context. It also allowed us to explore novel tactile experiences, new tools, and ad hoc workarounds that facilitated collaborating with sensory designers, integrating the different sensory stimuli, and evaluating the impact of the multisensory experiences. However, these opportunities come with practical challenges, such as constraints on data collection at the gallery (e.g., limited time and methods), limited control over the artwork selection (e.g., constrained by the gallery’s collection), and the lack of controlled conditions for establishing a baseline experience against which to compare the different designs. Moreover, balancing the different stakeholders’ requirements and ideas in this kind of project can be challenging. Despite those challenges, Tate Sensorium allowed us to think beyond traditional art and museum experiences and encouraged our HCI research team to be more creative in how we design, develop, implement, and evaluate experiences.


At the same time, the other stakeholders appreciated the scientific lens we brought to the design process. While we continuously encouraged discussion of the different design choices throughout this process, in particular in the design of the sound-haptic experience, there remains an unknown level of reliance on the team’s “artistic instinct,” which cannot always be grasped or quantified. However, it is from the combination of new technological opportunities, scientific knowledge, and this instinct that innovative multisensory experiences may develop.

Overall, the exhibition attracted extensive publicity within the U.K. and worldwide (e.g., the BBC, Wired, and The Wall Street Journal [7]). It was also awarded the 2016 Design Week award in the exhibition category [7]. We believe that Tate Sensorium is just the beginning of a new form of multisensory experience in art galleries and museums. Moreover, it provides designers and developers with new prospects in the context of interactive media and public engagement. We are convinced that our understanding of multisensory art experiences and design, based on novel interactive technologies, can be advanced through projects like Tate Sensorium (in addition to field and lab-based studies). We hope that Tate Sensorium will inspire researchers and practitioners in the creative industry to explore new ways of engaging people and exploiting all human senses in the design of interactive experiences.

Acknowledgments

This work was supported by the IK Prize 2015 and by the EC within the Horizon 2020 program through the ERC (Starting Grant agreement 638605). Special thanks go to the Flying Object team, particularly Peter Law and Tom Pursey, and to the Tate Britain team led by Tony Guillan. We thank the whole Tate Sensorium team: audio created by Nick Ryan; scents designed by Odette Toilette; chocolates created by Paul A. Young; lighting design by Cis O’Boyle; interactive theater design by Annette Mees; interface development by Make Us Proud and team. Special thanks go to the UltraHaptics team for their continued support with mid-air haptic development. Finally, we thank all visitors of Tate Sensorium at Tate Britain for their participation and feedback.

References

1. Obrist, M., Ranasinghe, N., and Spence, C. Special Issue: Multisensory Human-Computer Interaction. International Journal of Human-Computer Studies 107 (Nov. 2017), 1–4; https://doi.org/10.1016/j.ijhcs.2017.06.002

2. Seah, S.A., Martinez Plasencia, D., Bennett, P.D., Karnik, A., Otrocol, V.S., Knibbe, J., Cockburn, A., and Subramanian, S. SensaBubble: A chrono-sensory mid-air display of sight and smell. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 2014, 2863–2872; https://doi.org/10.1145/2556288.2557087

3. Ranasinghe, N., Jain, P., Karwita, S., and Yi-Luen Do, E. Virtual Lemonade: Let’s teleport your lemonade! Proc. of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction. ACM, New York, 2017, 183–190; https://doi.org/10.1145/3024969.3024977

4. Obrist, M., Velasco, C., Vi, C., Ranasinghe, N., Israr, A., Cheok, A., Spence, C., and Gopalakrishnakone, P. Sensing the future of HCI: Touch, taste, and smell user interfaces. Interactions 23, 5 (Sept.–Oct. 2016), 40–49; https://doi.org/10.1145/2973568

5. Obrist, M., Seah, S.A., and Subramanian, S. Talking about tactile experiences. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013; https://doi.org/10.1145/2470654.2466220

6. Vi, C.T., Ablart, D., Gatti, E., Velasco, C., and Obrist, M. Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition. International Journal of Human-Computer Studies 108 (2017), 1–14. ISSN 1071-5819; http://dx.doi.org/10.1016/j.ijhcs.2017.06.004

7. BBC News article: http://www.bbc.co.uk/news/entertainment-arts-34049150; Wired UK article: http://www.wired.co.uk/article/tate-sensorium-review-2015; WSJ article: http://www.wsj.com/articles/please-touch-the-art-work-new-tate-exhibit-will-stimulate-all-five-senses-1439495457; Design Week Awards 2016: https://www.designweek.co.uk/issues/13-19-june-2016/design-week-awards-2016-winners-full/

Authors

Damien Ablart is a Ph.D. student in informatics at the University of Sussex working on the integration of touch and taste for interactive media. He created the multisensory (touch-sound) experience for Full Stop in the Tate Sensorium. His ambition is to support artistic creations of multisensory experiences. da292@sussex.ac.uk

Carlos Velasco is an assistant professor at BI Norwegian Business School, Oslo, Norway, and also a visiting research fellow at the Sussex Computer Human Interaction (SCHI) Lab at the University of Sussex. He has a special research interest in crossmodal correspondences and their applications to multisensory experience design. carlos.velasco@bi.no

Chi Thanh Vi is a postdoctoral research fellow at the Sussex Computer Human Interaction (SCHI) Lab. He is interested in using brain-sensing methods (for example, EEG and fMRI) to understand the neural basis of user states in response to sensory stimuli, in order to inform the design of brain-computer interfaces. c.vi@sussex.ac.uk

Elia Gatti is a postdoctoral research fellow at the Sussex Computer Human Interaction (SCHI) Lab who works to apply insights from psychophysics and computational neurosciences to HCI. He is particularly interested in communicating emotions through sensory stimuli (especially haptics) and object perception in virtual reality. elia.gatti1986@gmail.com

Marianna Obrist is a reader (associate professor) in informatics at the University of Sussex and head of the Sussex Computer Human Interaction (SCHI) Lab. Her vision and ambition are to gain a rich and integrated understanding of people’s tactile, gustatory, and olfactory experiences for interactive technology. m.obrist@sussex.ac.uk


©2017 ACM  1072-5520/17/11  $15.00

