Demo Hour

XVIII.4 July + August 2011


Authors:
Ray Yun, Mark Gross, Dan Newton, Mark Marshall, Andrew Stevenson, Christopher Perez, Roel Vertegaal, Ayumi Kawakami, Koji Tsukada, Keisuke Kambara, Itiro Siio



RayMatic: An Emotionally Expressive Meter with a Human Face

Can interfaces that use facial expressions and gestures make human-computer communication more friendly and engaging? RayMatic, an emotionally expressive meter, looks like a framed photograph. Whereas traditional meters display numbers or graphs statically, the person in RayMatic’s frame (Ray) responds interactively to users, conveying environmental information from the connected sensors with facial expressions and gestures. For example, Ray fans himself when it is too hot or covers his ears and frowns when it is too noisy. The photo is displayed on a touchscreen so you can change set points and thresholds, and also play with Ray by poking his photo.
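The mapping described above — sensor readings crossing thresholds trigger particular expressions — can be sketched as follows. The threshold values and animation names here are illustrative assumptions, not taken from the RayMatic implementation:

```python
# Hypothetical sketch of RayMatic's sensing-to-expression mapping.
# Thresholds and animation names are illustrative, not from the paper.

ANIMATIONS = {
    "too_hot": "fan_self",
    "too_noisy": "cover_ears_and_frown",
    "neutral": "idle",
}

def pick_reaction(temp_c, noise_db, temp_max=28.0, noise_max=70.0):
    """Choose which animation the on-screen character should play."""
    if temp_c > temp_max:
        return ANIMATIONS["too_hot"]
    if noise_db > noise_max:
        return ANIMATIONS["too_noisy"]
    return ANIMATIONS["neutral"]

print(pick_reaction(31.0, 50.0))   # fan_self
print(pick_reaction(22.0, 80.0))   # cover_ears_and_frown
```

The touchscreen setpoints the authors mention would simply adjust `temp_max` and `noise_max` at runtime.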

Project website: http://code.arc.cmu.edu/projects/raymatic/

Publication:

Yun, R. and Gross, M.D. The RayMatic: A thermostat with a human face. Proc. of the 5th International Conference on Tangible, Embedded, and Embodied Interaction (Funchal, Portugal, Jan. 23–26). ACM, New York, 2011, 261–262.

Yun, R. and Gross, M.D. RayMatic: Ambient meter display with facial expression and gesture. Proc. of the 29th International Conference on Human Factors in Computing Systems (Vancouver, Canada, May 7–12). ACM, New York, 2011.


* Authors

Ray Yun | Carnegie Mellon University | ryun@andrew.cmu.edu
Mark D. Gross | Carnegie Mellon University | mdgross@cmu.edu

The Augmentalist

The Augmentalist is a system that allows musicians to augment their existing musical instruments with a variety of sensors. It focuses on musicians and their musical aims, rather than on the sensor technology being used. The overall goal is to allow musicians to easily explore new performance techniques with augmented musical instruments. The sensors can detect various performance gestures, such as moving the neck of a guitar up and down, and musicians can map these gestures to control audio effects or create new sounds. These mappings enable new playing techniques and invite further exploration of augmented-instrument performance practice.
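At the heart of any such gesture-to-effect mapping is rescaling a sensor reading into an effect parameter's range. The sketch below assumes a tilt sensor on the guitar neck; the names and ranges are illustrative, not the Augmentalist's actual API:

```python
# Illustrative gesture-to-effect mapping: a sensor value (e.g., the
# tilt of a guitar neck in degrees) is rescaled to an audio-effect
# parameter in 0..1. Ranges are assumptions for the sketch.

def map_range(value, in_lo, in_hi, out_lo=0.0, out_hi=1.0):
    """Linearly rescale a sensor reading, clamped to the output range."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + (out_hi - out_lo) * min(max(t, 0.0), 1.0)

# Tilting the neck from -30 to +30 degrees sweeps a delay mix from 0 to 1.
delay_mix = map_range(15.0, -30.0, 30.0)
print(delay_mix)  # 0.75
```

Clamping keeps the effect parameter valid even when the gesture overshoots the calibrated range.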

Project website: http://big.cs.bris.ac.uk/projects/musicinterfaces

Publication:

Newton, D. and Marshall, M.T. The Augmentalist: Enabling musicians to develop augmented musical instruments. Proc. of the 5th International Conference on Tangible, Embedded, and Embodied Interaction (Funchal, Portugal, Jan. 23–26). ACM, New York, 2011, 249–252.


* Authors

Dan Newton | University of Bristol | djslylogic@googlemail.com
Mark T. Marshall | University of Bristol | mark@cs.bris.ac.uk

Pneuma: A Dynamically Inflatable Hemispherical Multi-touch Display

Pneuma is a dynamic display that uses the elastic properties of latex to explore the entanglement of form and function. With the power of air pressure it can change from a flat screen to a dome or bowl shape depending on the information being displayed. For example, a view of the Earth from far away will be presented on a hemispherical display, and as you zoom in it will deflate to a flat screen. The latex material allows you to press into the screen, deforming it in unusual ways and adding a z-axis component not possible with rigid screens.
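The zoom-driven shape change described above amounts to mapping the view's zoom level to a target inflation for the pneumatic system. This sketch is a hypothetical control policy, not the authors' implementation; the zoom bounds are assumed:

```python
# Hypothetical policy: far-out views inflate the latex screen into a
# dome, zoomed-in views deflate it flat. Bounds are assumptions.

def target_inflation(zoom, dome_zoom=1.0, flat_zoom=5.0):
    """Return 1.0 (full dome) at far zoom, 0.0 (flat) when zoomed in,
    interpolating linearly in between."""
    t = (zoom - dome_zoom) / (flat_zoom - dome_zoom)
    return 1.0 - min(max(t, 0.0), 1.0)

print(target_inflation(1.0))  # 1.0 (globe view: full dome)
print(target_inflation(5.0))  # 0.0 (street view: flat screen)
```

An actual system would feed this target to a pressure controller rather than set it directly.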

Project website: http://www.hml.queensu.ca/

Publication:

Stevenson, A., Perez, C., and Vertegaal, R. An inflatable hemispherical multi-touch display. Proc. of the 5th International Conference on Tangible, Embedded, and Embodied Interaction (Funchal, Portugal, Jan. 23–26). ACM, New York, 2011, 289–292.


* Authors

Andrew Stevenson | Queen’s University | andrews@cs.queensu.ca
Christopher Perez | Queen’s University | perez@cs.queensu.ca
Roel Vertegaal | Queen’s University | roel@cs.queensu.ca

PotPet: Pet-like Flowerpot Robot

PotPet is a flowerpot-type robot that helps people grow plants more effectively and enjoyably. PotPet acts autonomously, as pets do—for example, it automatically moves to sunny places and approaches people when it requires water. PotPet consists of a real plant, several sensors that detect the plant's status, a wheeled robot base for mobility, and a microcontroller that coordinates these components. We plan to extend PotPet's behavior to respond to the types, sizes, and growth phases of the plants attached to it, and will install PotPet in the garden of an actual house to evaluate its effectiveness.
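The autonomous behaviors described — seeking sunlight and approaching people when thirsty — suggest a simple sense-decide-act loop. The sensor names and thresholds below are assumptions made for the sketch, not details from the publication:

```python
# Illustrative decision step for a PotPet-like robot. Sensor names and
# thresholds are assumptions for this sketch, not from the paper.

def decide_action(soil_moisture, light_level, person_nearby,
                  dry_threshold=0.2, dark_threshold=0.4):
    """Pick the robot's next behavior from normalized sensor readings."""
    if soil_moisture < dry_threshold and person_nearby:
        return "approach_person"   # ask a nearby person for water
    if light_level < dark_threshold:
        return "seek_sunlight"     # drive toward a brighter spot
    return "stay_put"

print(decide_action(0.1, 0.8, True))   # approach_person
print(decide_action(0.5, 0.2, False))  # seek_sunlight
```

On the real device, a microcontroller would run a loop like this over the plant-status sensors and drive the wheels accordingly.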

Project website: http://orange.siio.jp/~ayumi/en/potpet.html

Publication:

Kawakami, A., Tsukada, K., Kambara, K., and Siio, I. PotPet: Pet-like flowerpot robot. Proc. of the 5th International Conference on Tangible, Embedded, and Embodied Interaction (Funchal, Portugal, Jan. 23–26). ACM, New York, 2011, 263–264.


* Authors

Ayumi Kawakami | Ochanomizu University | kawakami.ayumi@is.ocha.ac.jp
Koji Tsukada | Ochanomizu University and JST PRESTO | tsuka@acm.org
Keisuke Kambara | Ochanomizu University | kambara@sappari.org
Itiro Siio | Ochanomizu University | siio@acm.org

Footnotes

These demo projects were curated especially for interactions by Florian “Floyd” Mueller and Vassilis Kostakos from papers presented at ACM’s 5th International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’11) in Funchal, Portugal. Mueller is a Fulbright Visiting Scholar at Stanford University; Kostakos is an assistant professor in the Madeira Interactive Technologies Institute at the University of Madeira.

©2011 ACM  1072-5220/11/0700  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2011 ACM, Inc.
