Demo Hour

XXVI.2 March - April 2019
Page: 10


Authors:
Frederik Westergård, Jonathan Komang-Sønderbek, Malthe Blichfeldt, Jonas Fritsch, Tiffany Wun, Claire Mikalauskas, Kevin Ta, Joshua Horacsek, Lora Oehlberg, Daisuke Uriu, William Odom, Mei-Kei Lai, Masahiko Inami, Harvey Bewley, Laurens Boer


For this issue, Erik Grönvall and Peter Hasdell selected four demos from the DIS 2018 Demo session that took place in the School of Design at the Hong Kong Polytechnic University. Each work explores what a robot or interactive system is and how it can be used, asking whether and how technology can have a role in intimate, private contexts such as funeral rituals, and what may happen if we start to augment and facilitate interactions with trees and other natural phenomena.

Erik Grönvall, Peter Hasdell (DIS 2018 Demo Chairs), and Anne Spaa

1. The Living Tree

The Living Tree is an interactive sound installation to explore the life of trees. To design an immersive and affectively engaging interaction, we deployed surface transducers in a forest in Denmark. The transducers emit vibrations through any material you press them against, turning the material into a speaker. When you place your ear onto the trunk of the tree, you hear the sounds of water running through its wooden veins combined with a heavy pulse and breathing. Through the sonic interaction, people might form new affective attachments to trees as living creatures with personality, character, and expression.

Blichfeldt, M.E., Komang-Sønderbek, J., Westergård, F.H., and Fritsch, J. The Living Tree: Using surface transducers to explore the secret life of trees through sonic interactions. Proc. of the 2018 ACM Conference Companion Publication on Designing Interactive Systems. ACM, New York, 2018, 327–330; https://doi.org/10.1145/3197391.3205398

https://vimeo.com/228964322 (The Living Tree)

https://vimeo.com/229704732 (Micro Habitat)

Frederik Højlund Westergård, Jonathan Komang-Sønderbek, Malthe Emil Blichfeldt, and Jonas Fritsch, IT University of Copenhagen
frit@itu.dk

ins01.gif The installation was tested in a real-life setting in Denmark and created in collaboration with the Danish Nature Agency.
ins02.gif The soundscape is activated and changes depending on your movements around the tree based on readings from a Kinect sensor.
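As a rough illustration of the movement-driven soundscape described above, the sketch below maps a visitor's Kinect-tracked position to volumes for layered sounds (water, pulse, breathing). The function names, tree position, and activation radius are illustrative assumptions, not the installation's actual implementation.

```python
import math

TREE_POS = (0.0, 0.0)    # tree location in the Kinect's floor plane (meters)
ACTIVATION_RADIUS = 3.0  # beyond this distance the soundscape falls silent

def layer_volumes(visitor_x, visitor_z):
    """Map a visitor's distance from the tree to per-layer volumes (0..1)."""
    d = math.hypot(visitor_x - TREE_POS[0], visitor_z - TREE_POS[1])
    closeness = max(0.0, 1.0 - d / ACTIVATION_RADIUS)
    return {
        "water": closeness,        # water sounds fade in as you approach
        "pulse": closeness ** 2,   # pulse and breathing emerge only up close,
        "breathing": closeness ** 2,  # rewarding an ear pressed to the trunk
    }
```

Squaring the closeness for the pulse and breathing layers is one simple way to make the most intimate sounds audible only when a listener is right at the trunk.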

2. Robot Improv Puppet Theatre

Robot Improv Puppet Theatre (RIPT) is an improvised theater experience centered on an Arduino Braccio robot named Pokey. Pokey performs gestures and dialogue in short-form improv scenes driven entirely by audience input submitted from a mobile device, giving audience members the chance to see and hear their suggestions performed mid-scene. During a performance, a chosen audience member presses a GO button to randomly select and play back a submission. The human performers, acting alongside Pokey with full-body gestures, must then creatively interpret the robot's dialogue and actions to construct new storylines.
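The playback step can be pictured as a minimal sketch: audience members submit dialogue paired with a gesture, and the GO button draws one entry at random for Pokey to perform. The function names are hypothetical, not RIPT's actual code.

```python
import random

submissions = []  # (dialogue, gesture) pairs collected from the audience

def submit(dialogue, gesture):
    """Record an audience member's dialogue line and chosen gesture."""
    submissions.append((dialogue, gesture))

def press_go(rng=random):
    """Randomly select an audience entry for the robot to perform."""
    if not submissions:
        return None  # nothing submitted yet
    dialogue, gesture = rng.choice(submissions)
    return {"say": dialogue, "move": gesture}
```

Because selection is random rather than curated, the human performers cannot anticipate which suggestion will surface mid-scene, which is what forces the improvisation.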

Wun, T., Mikalauskas, C., Ta, K., Horacsek, J., and Oehlberg, L. RIPT: Improvising with an audience-sourced performance robot. Proc. of the 2018 ACM Conference Companion Publication on Designing Interactive Systems. ACM, New York, 2018, 323–326; https://doi.org/10.1145/3197391.3205397

Mikalauskas, C., Wun, T., Ta, K., Horacsek, J., and Oehlberg, L. Improvising with an audience-controlled robot performer. Proc. of the 2018 Designing Interactive Systems Conference. ACM, New York, 2018, 657–666; https://doi.org/10.1145/3196709.3196757

https://cmikalauskas.com/portfolio/performing-with-an-improv-robot/

https://youtu.be/h3mkdO5GWc4

Tiffany Wun, Claire Mikalauskas, Kevin Ta, Joshua Horacsek, and Lora Oehlberg, University of Calgary
twwun@ucalgary.ca

ins03.gif Audience member submitting dialogue and selecting a gesture for Pokey before a show.
ins04.gif The performer interacts with Pokey onstage by responding in the improvised scene.

3. SenseCenser

SenseCenser is a device for burning incense. It senses when incense chips are placed into it and the volume of incense smoke that is produced as the chips burn. Data captured from the integrated sensors can be connected to various applications, such as lighting equipment, sound systems, displays showing moving images, and more specific installations in particular rituals. We designed SenseCenser to investigate the potential role and place of interactive technologies in supporting Japanese funeral and memorialization rituals, as well as how it could be applied to other incense practices in different cultural contexts and settings.
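One way to picture the sensor-to-application coupling described above is an event-style sketch: when a chip is present and smoke density crosses a threshold, registered handlers (lights, sound, a photo display) are notified. The threshold and function names are illustrative assumptions, not SenseCenser's actual implementation.

```python
SMOKE_THRESHOLD = 0.4  # normalized smoke density that counts as "burning"

handlers = []

def on_smoke(handler):
    """Register a callback fired when burning incense is detected."""
    handlers.append(handler)
    return handler

def process_reading(chip_present, smoke_density):
    """Route one sensor reading to all connected applications."""
    if chip_present and smoke_density >= SMOKE_THRESHOLD:
        for handler in handlers:
            handler(smoke_density)
```

Decoupling the censer's readings from the applications in this way is what lets the same device drive lighting, sound, or a display, depending on the ritual.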

Uriu, D., Odom, W., Lai, M., Taoka, S., and Inami, M. SenseCenser: An interactive device for sensing incense smoke and supporting memorialization rituals in Japan. Proc. of the 2018 ACM Conference Companion Publication on Designing Interactive Systems. ACM, New York, 2018, 315–318; https://doi.org/10.1145/3197391.3205394

https://vimeo.com/273809393

Daisuke Uriu, The University of Tokyo
uriu@star.rcast.u-tokyo.ac.jp

William Odom, Simon Fraser University
wodom@sfu.ca

Mei-Kei Lai, Macao Polytechnic Institute
mklai@ipm.edu.mo

Masahiko Inami, The University of Tokyo
drinami@star.rcast.u-tokyo.ac.jp

ins05.gif Smoke from incense chips burned on the electric heater inside the Censer.
ins06.gif A SenseCenser installation showing a photo of the deceased for a memorialization ritual.

4. Social Robotic Donuts

To broaden the design space for social robots, we created two air-actuated, donut-shaped robots: a small, uniform silicone version and a more organic latex version. Our intent was to move away from the typical mimicry of human or animal forms, focusing instead on elastic expression, ambiguous form, and playful behaviors. Early studies show people forming curious relationships with these contestational objects, inviting new forms of communication and interaction.
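A choreographed elastic expression of the kind programmed for the donuts can be sketched as a sequence of inflation levels sent to a pump driver. `set_pressure` here is a placeholder for whatever valve or pump interface the actual hardware uses; the names and values are illustrative.

```python
def set_pressure(level):
    """Placeholder for the pump/valve driver (0.0 = deflated, 1.0 = fully inflated)."""
    print(f"pressure -> {level:.2f}")

def play(expression, actuate=set_pressure):
    """Play a choreographed expression: a sequence of inflation levels."""
    for level in expression:
        actuate(level)

# A slow, rhythmic "breathing" expression as one inflate/deflate cycle.
breathe = [0.2, 0.4, 0.6, 0.8, 0.6, 0.4, 0.2]
```

Expressing behaviors as simple level sequences keeps the choreography editable without touching the pneumatics, which suits rapid exploration of playful, non-anthropomorphic movement.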

Boer, L. and Bewley, H. Reconfiguring the appearance and expression of social robots by acknowledging their otherness. Proc. of the 2018 Designing Interactive Systems Conference. ACM, New York, 2018, 667–677; https://doi.org/10.1145/3196709.3196743

Bewley, H. and Boer, L. Designing Blo-nut: Design principles, choreography and otherness in an expressive social robot. Proc. of the 2018 Designing Interactive Systems Conference. ACM, New York, 2018, 1069–1080; https://doi.org/10.1145/3196709.3196817

https://ixdlab.itu.dk/

Harvey Bewley and Laurens Boer, IxD Lab, IT University of Copenhagen
laub@itu.dk

ins07.gif Hardware and software setup for programming the elastic expressions of the robotic donuts.
ins08.gif Participants exploring possible interactions with a silicone robotic donut.


©2019 ACM  1072-5520/19/03  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

