Demo Hour

XXII.2 March + April 2015
Page: 6

Authors:
Max Rheiner, Thomas Tobler, Fabian Troxler, Seki Inoue, Keisuke Hasegawa, Yasuaki Monnai, Yasutoshi Makino, Hiroyuki Shinoda, Jules Françoise, Norbert Schnell, Riccardo Borghesi, Frédéric Bevilacqua, Tuncay Cakmak, Holger Hager

1. Birdly

Birdly is an installation that explores the experience of a bird in flight, capturing the mediated flying experience through several methods. Unlike a common flight simulator, you do not control a machine; you embody a bird. To evoke this embodiment, we rely mainly on sensory-motor coupling: participants control the simulator with their hands and arms, which directly correlate to the wings and primary feathers of a bird, and perceive the scenery from the first-person perspective of a bird. To intensify the embodiment, we add sonic, olfactory, tactile, and wind feedback.
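
The sensory-motor coupling described above can be sketched as a simple mapping from wing-lever input to flight state. This is an illustrative Python sketch, not Birdly's actual control code; the function name, gains, and state variables are all assumed for the example.

```python
# Hypothetical sketch of sensory-motor coupling: arm positions sensed at the
# simulator's wing levers are mapped to the simulated bird's flight state.
# All names and gain values here are illustrative, not Birdly's real ones.

def wing_to_flight(left_arm_angle, right_arm_angle, flap_speed,
                   lift_gain=0.8, roll_gain=1.2):
    """Map wing-lever angles (radians) and flap speed (rad/s) to flight state."""
    # Asymmetric wing angles bank the bird left or right.
    roll = roll_gain * (right_arm_angle - left_arm_angle)
    # Faster flapping produces more lift; gliding (flap_speed ~ 0) gives none.
    lift = lift_gain * max(flap_speed, 0.0)
    # The mean dihedral angle pitches the bird up or down.
    pitch = 0.5 * (left_arm_angle + right_arm_angle)
    return {"roll": roll, "pitch": pitch, "lift": lift}

state = wing_to_flight(0.1, 0.3, flap_speed=2.0)
```

In a full simulator this state would feed a flight-dynamics model each frame, alongside the wind and tactile feedback channels.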

http://birdly.zhdk.ch

https://vimeo.com/91069214

https://vimeo.com/104858339

Max Rheiner, Zurich University of the Arts
[email protected]

Thomas Tobler, Zurich University of the Arts
[email protected]

Fabian Troxler, Zurich University of the Arts
[email protected]

ins01.jpg Birdly allows users to embody a flying bird.

2. HORN—Ultrasound Airborne Volumetric Haptic Display

Interaction with mid-air floating virtual objects expands human-computer interface possibilities. Here we propose a system that superimposes volumetric haptic sensations on mid-air floating images using an acoustic potential distribution.

Our surrounding phased-array system freely produces 3D spatial patterns of ultrasonic standing waves, which create various feelings of elastic and textured surfaces. The ultrasound does not affect the optical images and can be controlled quickly in this interactive system. The combination of 3D volumetric vision and this haptic technology flexibly displays the presence of 3D objects that can be pinched, handled, and manipulated.
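The phased-array focusing behind the acoustic potential field can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: the array geometry, 40 kHz drive frequency, and focal point are assumed example values, and a real system would superpose and rapidly update many such foci to shape a volumetric standing-wave pattern.

```python
import math

SPEED_OF_SOUND = 346.0   # m/s in air at roughly 25 degrees C
FREQ = 40_000.0          # 40 kHz, a typical airborne-ultrasound frequency
WAVELENGTH = SPEED_OF_SOUND / FREQ
K = 2.0 * math.pi / WAVELENGTH  # wavenumber

def focus_phases(transducers, focal_point):
    """Return a drive phase in [0, 2*pi) for each transducer so that all
    emitted waves arrive at focal_point in phase (constructive interference)."""
    fx, fy, fz = focal_point
    phases = []
    for tx, ty, tz in transducers:
        dist = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # Advance each element's phase to cancel its propagation delay.
        phases.append((K * dist) % (2.0 * math.pi))
    return phases

# Example: a 4x4 grid of elements at 10 mm pitch, focusing 20 cm above
# the center of the array.
array = [(0.01 * i, 0.01 * j, 0.0) for i in range(4) for j in range(4)]
phases = focus_phases(array, (0.015, 0.015, 0.2))
```

Elements symmetric about the focal axis receive identical phases, which is a quick sanity check on any such computation.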

http://www.hapis.k.u-tokyo.ac.jp/?portfolio=english-horn-hapt-optic-reconstruction&lang=en

https://www.youtube.com/watch?v=7IbdvOrtiDE

Inoue, S., Kobayashi, K., Monnai, Y., Hasegawa, K., Makino, Y., and Shinoda, H. HORN: The hapt-optic reconstruction. Proc. of SIGGRAPH 2014, Emerging Technologies. ACM, New York, 2014, Article 11.

Seki Inoue, The University of Tokyo
[email protected]

Keisuke Hasegawa, The University of Tokyo
[email protected]

Yasuaki Monnai, The University of Tokyo
[email protected]

Yasutoshi Makino, The University of Tokyo
[email protected]

Hiroyuki Shinoda, The University of Tokyo
[email protected]


3. MaD: Mapping by Demonstration for Continuous Sonification

MaD enables simple and intuitive design of continuous, gestural sonic interaction. When movement and sound examples are recorded jointly, the system automatically learns the motion-sound mapping. Our applications focus on vocal sounds, recorded while performing actions, as the primary material for interaction design. The system integrates probabilistic models with hybrid sound synthesis and operates independently of the motion-sensing device: it can be used with cameras, contact microphones, inertial measurement units, and other sensors. Applications range from the performing arts and gaming to medical uses such as auditory-aided rehabilitation.
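
The learned motion-sound mapping can be illustrated with a toy Gaussian mixture regression, one family of probabilistic models the MaD work builds on. The two components and all of their parameters below are hand-set stand-ins for what the system would learn from jointly recorded movement and vocal-sound examples; the real system handles multidimensional features and temporal models.

```python
import math

# Each component models the joint distribution of a 1D motion feature x and a
# 1D sound parameter y. At playback, a new x is mapped to E[y | x].
components = [
    # (weight, mean_x, var_x, mean_y, cov_xy) -- illustrative values only
    (0.5, 0.2, 0.01, 100.0, 0.5),   # slow gestures -> low pitch
    (0.5, 0.8, 0.01, 400.0, 0.5),   # fast gestures -> high pitch
]

def gaussian(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2.0 * math.pi * var)

def predict_sound_param(x):
    """Conditional expectation E[y | x] under the mixture."""
    resp = [w * gaussian(x, mx, vx) for w, mx, vx, _, _ in components]
    total = sum(resp)
    y = 0.0
    for r, (_, mx, vx, my, cxy) in zip(resp, components):
        # Per-component conditional mean: my + (cov_xy / var_x) * (x - mx),
        # blended by the component's responsibility for this x.
        y += (r / total) * (my + (cxy / vx) * (x - mx))
    return y
```

Inputs near a component's mean reproduce that component's sound parameter, while intermediate inputs interpolate smoothly between them, which is what makes the mapping continuous.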

http://ismm.ircam.fr/siggraph2014-mad/

http://vimeo.com/julesfrancoise/mad

Françoise, J., Schnell, N., and Bevilacqua, F. A multimodal probabilistic model for gesture-based control of sound synthesis. Proc. of the 21st ACM International Conference on Multimedia. ACM, New York, 2013, 705–708. DOI:10.1145/2502081.2502184

Françoise, J., Schnell, N., Borghesi, R., and Bevilacqua, F. Probabilistic models for designing motion and sound relationships. Proc. of the 2014 International Conference on New Interfaces for Musical Expression. 2014, 287–292.

Jules Françoise, IRCAM.CNRS.UPMC
[email protected]

Norbert Schnell, IRCAM.CNRS.UPMC
[email protected]

Riccardo Borghesi, IRCAM.CNRS.UPMC
[email protected]

Frédéric Bevilacqua, IRCAM.CNRS.UPMC
[email protected]


4. Cyberith Virtualizer

The Virtualizer is an easy-to-use virtual reality device that allows the user to walk through any kind of virtual environment in real time. It does so by combining a low-friction principle and high-precision sensors with a special mechanical construction, resulting in a new form of omnidirectional treadmill.


http://www.cyberith.com

https://www.youtube.com/watch?v=dVvYfonQJpk

https://www.youtube.com/watch?v=bgblE3nxvNg

Tuncay Cakmak, Cyberith
[email protected]

Holger Hager, Cyberith
[email protected]



©2015 ACM  1072-5520/15/0300  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2015 ACM, Inc.
