Demo Hour

XX.5 September + October 2013
Page: 8

Javier Quevedo-Fernández, J.B.O.S. Martens, John Paulin Hansen, Wang Wusheng, Irina Shklovski, Jari Varsaluoma, Ville Kenttä, Alexandre Alapetite, I. Scott MacKenzie


idAnimate

idAnimate is a general-purpose animation sketching tool for multitouch devices, in particular the iPad. It allows users to create meaningful and expressive animations in seconds or minutes: animations are created simply by moving, scaling, and rotating objects on the screen with the fingertips. The system records the changes made to the objects and can replay them later. Complex animations are built up incrementally using this record-while-playback technique.
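The record-while-playback idea can be sketched in a few lines. This is a hypothetical illustration, not idAnimate's actual code: each object gets a track of timestamped transforms, and a new recording layered over an existing track replaces the overlapping time span, so an animation can be refined incrementally.

```python
class TransformTrack:
    """One object's animation track: timestamped (x, y, scale, angle)
    samples. A minimal sketch of record-while-playback: a new recording
    overdubbed onto an existing track replaces the overlapping span,
    so complex animations can be built up incrementally."""

    def __init__(self):
        self.samples = []  # sorted list of (t, (x, y, scale, angle))

    def record(self, t, transform):
        # Called for every touch event while the user manipulates the object.
        self.samples.append((t, transform))

    def overdub(self, other):
        # Keep existing samples outside the new recording's time span,
        # replace those inside it (incremental refinement).
        if not other.samples:
            return
        t_start = other.samples[0][0]
        t_end = other.samples[-1][0]
        kept = [s for s in self.samples if s[0] < t_start or s[0] > t_end]
        self.samples = sorted(kept + other.samples)

    def at(self, t):
        # Nearest earlier sample wins (step playback; a real tool would
        # interpolate between samples).
        current = self.samples[0][1]
        for ts, transform in self.samples:
            if ts <= t:
                current = transform
            else:
                break
        return current
```

Replaying is then just stepping through `at(t)` on a timer, which matches the demo's description of recording manipulations and replaying them at a later stage.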

We are currently setting up a longitudinal study with the tool. Those interested in participating should refer to the project website.

Project website:

Publications: Quevedo-Fernández, J., Schouren, J.M., and Martens, J.B.O.S. On the development of idShare, a platform to support interaction design activities of small co-located teams. International Workshop on Designing Collaborative Interactive Spaces for eCreativity, eLearning, eScience (DCIS'12).

Quevedo-Fernández, J. and Martens, J.B.O.S. Demonstrating idAnimate: A multi-touch system for sketching and modifying animations. Proc. NordiCHI 2012. ACM, New York, 767-768.

Javier Quevedo-Fernández | Eindhoven University of Technology

J.B.O.S. Martens | Eindhoven University of Technology



TalkingBadge

TalkingBadge is a Bluetooth platform for indoor location-based audio messaging, supporting zone-specific information retrieval and one-way text-to-speech paging via a smartphone or a dedicated TalkingBadge device carried by the user. When people walk through a zone, which can cover up to 50 meters, they can listen to short audio messages sent to them. The platform provides zone-based tracking at low cost, making large-scale indoor deployment feasible for a range of locations, including airports, shopping malls, and hospitals. The design also addresses the privacy concerns of location-based services by using episodic rather than continuous tracking through the defined zones.
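Episodic tracking means the system registers only zone entry and exit events rather than a continuous position trace. The following is a hypothetical sketch of that idea; the zone model, message store, and example message text are all illustrative, not the platform's actual implementation.

```python
class ZoneTracker:
    """Episodic zone tracking: logs only zone entry/exit events rather
    than a continuous position trace, in the spirit of the privacy
    approach described for TalkingBadge. Illustrative sketch only."""

    def __init__(self, zone_messages):
        self.zone_messages = zone_messages  # zone id -> audio message text
        self.current_zone = None
        self.events = []                    # episodic log: (kind, zone id)

    def on_beacon(self, zone_id):
        """Called with the id of the Bluetooth beacon currently in range,
        or None when no beacon is detected. Returns the message to speak
        on zone entry, if any."""
        if zone_id == self.current_zone:
            return None                     # no state change, nothing logged
        if self.current_zone is not None:
            self.events.append(("exit", self.current_zone))
        self.current_zone = zone_id
        if zone_id is None:
            return None
        self.events.append(("enter", zone_id))
        return self.zone_messages.get(zone_id)
```

Note that repeated beacon sightings inside the same zone produce no log entries at all, which is what distinguishes this scheme from continuous tracking.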

Project website:

Publications: Hansen, J.P., Glenstrup, A.J., Wusheng, W., Weiping, L., and Zhonghai, W. Collecting location-based voice messages on a TalkingBadge. Proc. NordiCHI 2012. ACM, New York, 219-227.

Hansen, J., Alapetite, A., Andersen, H., Malmborg, L., and Thommesen, J. Location-based services and privacy in airports. Human-Computer Interaction–INTERACT 2009 (2009), 168-181.

John Paulin Hansen | IT University of Copenhagen

Wang Wusheng | Peking University

Irina Shklovski | IT University of Copenhagen



DrawUX

DrawUX is a Web-based research tool for retrospective long-term user experience evaluation, especially in remote studies. Users sketch a curve and add text comments to report how their experience of a product or service has changed over time. In this way, DrawUX provides both quantitative and qualitative data about users' experiences during long-term use.
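A retrospective UX curve of this kind pairs a numeric series with anchored free-text comments. The data model below is a hypothetical sketch of that pairing; the field names, value range, and summary metrics are illustrative assumptions, not DrawUX's actual schema.

```python
from statistics import mean

class UXCurve:
    """A retrospective UX curve in the spirit of DrawUX: the quantitative
    part is a series of (time, experience) points sketched by the user,
    the qualitative part is free-text comments anchored to moments on the
    curve. Names and metrics are illustrative, not DrawUX's schema."""

    def __init__(self):
        self.points = []    # (t, value), value clamped to [-1.0, 1.0]
        self.comments = []  # (t, text)

    def add_point(self, t, value):
        self.points.append((t, max(-1.0, min(1.0, value))))

    def annotate(self, t, text):
        self.comments.append((t, text))

    def summary(self):
        # Simple quantitative readout: average experience and overall
        # change from the first to the last sketched point.
        values = [v for _, v in self.points]
        return {"mean": mean(values), "trend": values[-1] - values[0]}
```

The comments remain attached to their time positions, so a researcher can line up a dip or rise in the curve with the user's own explanation of it.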

DrawUX is developed in the Delightful Long-Term User Experience project (DELUX) in the Unit of Human-Centered Technology at Tampere University of Technology. The project is funded by the Finnish Funding Agency for Technology and Innovation (Tekes) and collaborating companies.

Project website:

Publication: Varsaluoma, J. and Kenttä, V. DrawUX: Web-based research tool for long-term user experience evaluation. Proc. NordiCHI 2012. ACM, New York, 769-770.

Jari Varsaluoma | Tampere University of Technology

Ville Kenttä | Tampere University of Technology


Gaze-Controlled Flying

Developing control paradigms for unmanned aerial vehicles (UAVs) is a new challenge for HCI. This demo explores how gaze can be used as input for locomotion in 3-D. A low-cost drone is controlled by tracking the user's point of regard (gaze) on a live video stream from the UAV: the drone flies in the direction the person is looking. Our approach relies on a direct feedback loop with no visible interface components displayed. The system tracks the point of regard as the user, situated in a control room, observes the streaming video and thereby continuously adjusts the locomotion.
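The core of such a feedback loop is a mapping from the gaze point on the video frame to a steering command. The function below is a hypothetical sketch of one such mapping, with an invented dead zone and gain; the actual control law used in the demo may differ.

```python
def gaze_to_command(gx, gy, width, height, dead_zone=0.1, gain=1.0):
    """Maps a gaze point (gx, gy) on the live video frame to a steering
    command: looking right of center steers right, looking above center
    steers up/forward, and a central dead zone keeps the drone steady
    when the user fixates on the middle of the image. Hypothetical
    mapping consistent with the direct feedback loop described."""
    # Normalize to [-1, 1] with (0, 0) at the frame center.
    nx = (gx - width / 2) / (width / 2)
    ny = (height / 2 - gy) / (height / 2)  # invert: screen y grows downward
    # Suppress small offsets so steady fixation near the center hovers.
    if abs(nx) < dead_zone:
        nx = 0.0
    if abs(ny) < dead_zone:
        ny = 0.0
    return {"roll": gain * nx, "pitch": gain * ny}
```

Because the command is recomputed on every gaze sample over the streaming video, the user corrects the drone's course continuously just by looking, with no buttons or on-screen widgets.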

Project website:

Publication: Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H.H.T., Hansen, J.P., Hansen, D.W., and Møllenbach, E. Gaze-controlled driving. Proc. CHI 2009 Extended Abstracts. ACM, New York, 4387-4392.

Alexandre Alapetite | Technical University of Denmark

John Paulin Hansen | IT University of Copenhagen

I. Scott MacKenzie | York University



The Demo program at the most recent ACM NordiCHI conference allowed attendees to directly interact with and experience novel interactive systems. Demonstrations were reviewed based on relevance, technological quality, and creativity. The projects shown here were selected by Eve Hoggan, researcher in the UIx group at the Helsinki Institute for Information Technology (HIIT) and postdoctoral researcher at the University of Helsinki; Mikkel R. Jakobsen, assistant professor in the Department of Computer Science at the University of Copenhagen; and Morten Fjeld, professor at Chalmers University of Technology and head of the t2i interaction laboratory.

©2013 ACM  1072-5220/13/09  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2013 ACM, Inc.
