Demo Hour

XX.1 January + February 2013
Page: 10


Authors:
Anke Brock, Philippe Truillet, Bernard Oriola, Delphine Picard, Christophe Jouffrais, Götz Wintergerst, Ron Jagodzinski, Peter Giles, Sangwon Choi, Jiseong Gu, Jaehyun Han, Seongkook Heo, Sunjun Kim, Geehyuk Lee, Gloria Ronchi, Claudio Benghi

Kintouch

How do visually impaired people explore tactile images? Little is known about this question, yet answering it would help us design interactive interfaces that better meet the needs of visually impaired users. Studies of haptic exploration strategies in psychology usually rely on video observation, which is difficult and time-consuming. We introduce Kintouch, a prototype that tracks finger movements by combining data from the Microsoft Kinect camera with a multitouch display. It registers the location and timing of hand and finger movements and can thus help in analyzing haptic exploration strategies. Results of preliminary studies suggest this approach is promising.
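One way to combine the two data streams, sketched below, is to attribute each touch contact to the nearest Kinect-tracked fingertip. This is only an illustrative sketch: the event format, distance threshold, and function names are our assumptions, not the authors' implementation, and we assume the depth-camera fingertips have already been calibrated into screen coordinates.

```python
import math

def associate_touches(touch_events, fingertip_estimates, max_dist=40.0):
    """Match each touch event to the nearest Kinect-tracked fingertip.

    touch_events: list of (x, y) screen coordinates from the multitouch display.
    fingertip_estimates: dict mapping a finger label -> (x, y) from the depth
    camera, assumed already projected into screen coordinates.
    Returns a list of (touch, finger_label_or_None) pairs.
    """
    matches = []
    for tx, ty in touch_events:
        best, best_d = None, max_dist
        for label, (fx, fy) in fingertip_estimates.items():
            d = math.hypot(tx - fx, ty - fy)
            if d < best_d:
                best, best_d = label, d
        # A contact with no fingertip nearby stays unattributed (None).
        matches.append(((tx, ty), best))
    return matches

# Example: one contact near the tracked right index finger, one stray contact.
touches = [(100, 200), (400, 400)]
fingers = {"R-index": (105, 198), "R-middle": (160, 210)}
print(associate_touches(touches, fingers))
# → [((100, 200), 'R-index'), ((400, 400), None)]
```

Labeling each contact with a finger is what makes it possible to log which fingers a user employs, and in what order, during tactile exploration.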

Project website: http://www.irit.fr/~Anke.Brock/Kintouch/

Publication: Brock, A., Lebaz, S., Oriola, B., Picard, D., Jouffrais, C., and Truillet, P. Kin’touch: Understanding how visually impaired people explore tactile maps. CHI’12 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, 2471-2476.

Anke Brock | IRIT – CNRS & Université Toulouse | anke.brock@irit.fr

Philippe Truillet | IRIT – CNRS & Université Toulouse | philippe.truillet@irit.fr

Bernard Oriola | IRIT – CNRS & Université Toulouse | bernard.oriola@irit.fr

Delphine Picard | IUF & Université Toulouse | delphine.picard@univ-tlse2.fr

Christophe Jouffrais | IRIT – CNRS & Université Toulouse | christophe.jouffrais@irit.fr


Hap.pen

Sketching is an important part of the design process, yet it is underrepresented in multimodal interface systems. Hap.pen makes it possible to sketch haptic behaviors by assigning them to graphical user interface elements. One tip of the two-sided pen acts as a stylus on any capacitive touchscreen and lets the user draw graphical elements; in the process, haptic feedback is assigned to the brightness of those elements. When the opposite tip is moved over the screen, an internal color sensor reads the brightness values and the integrated actuator reproduces the corresponding haptic feedback.
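The core mapping could be as simple as a function from sensed luminance to actuator amplitude. The sketch below is purely hypothetical (the sensor range, gamma value, inversion, and function name are our assumptions, not details of the Hap.pen hardware):

```python
def brightness_to_amplitude(luminance, gamma=2.2, max_amp=1.0):
    """Map an 8-bit luminance reading (0-255) from the pen's color sensor
    to a normalized actuator amplitude in [0.0, max_amp].

    Darker elements produce stronger feedback here, so a black stroke on a
    white background is clearly felt; the inversion and the perceptual
    gamma curve are illustrative choices, not the device's actual mapping.
    """
    norm = max(0, min(255, luminance)) / 255.0
    return max_amp * (1.0 - norm) ** gamma

print(brightness_to_amplitude(0))    # black stroke: full amplitude → 1.0
print(brightness_to_amplitude(255))  # white background: no feedback → 0.0
```

Because the haptic behavior lives in the drawing itself (its brightness), any sketch made with the stylus tip immediately becomes "readable" by the sensing tip.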

Project website: http://www.hfg-gmuend.de/HapTECH_Entwicklung_haptisc….html

http://www.nuilab.com

Publication: Wintergerst, G., Jagodzinski, R., and Giles, P. Hap.pen: Sketching haptic behaviour. Mensch & Computer, Workshop Proceedings. 2011, 13-14.

Götz Wintergerst | Hochschule für Gestaltung Schwäbisch Gmünd | goetz.wintergerst@nuilab.com

Ron Jagodzinski | Hochschule für Gestaltung Schwäbisch Gmünd | ron.jagodzinski@nuilab.com

Peter Giles | Hochschule für Gestaltung Schwäbisch Gmünd | peter.giles@nuilab.com


ScreenPad

ScreenPad is an optical hover-tracking touchpad first introduced in the CHI'11 paper "RemoteTouch" as a way to give continuous visual feedback on the TV screen about a user's finger movement. It is now evolving into different forms for different applications. ScreenPad2, in the form of a large smartphone, enables bimanual operation as well as thumb-based TV remote operation. ScreenPad2 also proved useful for the laptop, enabling area-based gestures. ScreenPad3, with a long form factor and a longer hover-tracking range, is currently under development and aims to replace the whole area below the laptop keyboard.
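A hover-tracking touchpad must continuously decide whether each sensed finger is away, hovering, or touching. One plausible way to do this, sketched below with hysteresis so noise near a threshold does not cause state flicker, uses a normalized proximity signal; the thresholds, states, and class name are our assumptions, not the ScreenPad implementation.

```python
class HoverTouchClassifier:
    """Classify a per-finger proximity signal (0.0 = far, 1.0 = pressed)
    into 'away', 'hover', or 'touch', with separate enter/exit thresholds
    (hysteresis) for stable state transitions."""

    def __init__(self, hover_on=0.2, hover_off=0.15, touch_on=0.9, touch_off=0.8):
        self.state = "away"
        self.hover_on, self.hover_off = hover_on, hover_off
        self.touch_on, self.touch_off = touch_on, touch_off

    def update(self, proximity):
        if self.state == "touch":
            if proximity < self.touch_off:
                self.state = "hover" if proximity >= self.hover_off else "away"
        elif self.state == "hover":
            if proximity >= self.touch_on:
                self.state = "touch"
            elif proximity < self.hover_off:
                self.state = "away"
        else:  # away
            if proximity >= self.touch_on:
                self.state = "touch"
            elif proximity >= self.hover_on:
                self.state = "hover"
        return self.state

clf = HoverTouchClassifier()
print([clf.update(p) for p in (0.0, 0.3, 0.95, 0.85, 0.1)])
# → ['away', 'hover', 'touch', 'touch', 'away']
```

The hover state is what allows the TV screen to show a cursor preview before the finger actually commits to a touch.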

Project website: http://hcil.kaist.ac.kr/project/197600

Publication: Choi, S., Han, J., Lee, G., Lee, N., and Lee, W. RemoteTouch: Touch-screen-like interaction in the TV viewing environment. Proc. of CHI 2011. ACM, New York, 393-402.

Choi, S., Han, J., Kim, S., Heo, S., and Lee, G. ThickPad: A hover-tracking touchpad for a laptop. Proc. of UIST 2011 (demo). ACM, New York, 15-16.

Sangwon Choi, Jiseong Gu, Jaehyun Han, Seongkook Heo, Sunjun Kim, and Geehyuk Lee | KAIST | geehyuk@gmail.com


Heart_Bit Lamp

Aether & Hemera's Heart_Bit is an interactive lamp that provides not only ambient white light but also an emotional red accent that pulses in the same rhythm as the user's heartbeat. When you touch the sensor, your heartbeat is transformed into lighting effects. People can experience their heartbeats and see their pulses represented through "choreographic light," an attempt to make visible, public, and shareable something that is normally only audible. The aim is to provoke emotions and memories, elicit empathic connections, and uncover perceptions by embodying them in a different medium.
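Driving the red accent from a raw pulse-sensor signal typically amounts to detecting rising threshold crossings and timing them. The sketch below is a generic illustration of that idea; the threshold, sample rate, and function names are hypothetical and not Aether & Hemera's implementation.

```python
def detect_beats(samples, threshold=0.5):
    """Return sample indices where the pulse signal crosses the threshold
    upward; each crossing would trigger one red 'heartbeat' flash."""
    beats = []
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            beats.append(i)
    return beats

def bpm(beat_indices, sample_rate_hz):
    """Estimate beats per minute from the mean interval between beats."""
    if len(beat_indices) < 2:
        return 0.0
    intervals = [(b - a) / sample_rate_hz
                 for a, b in zip(beat_indices, beat_indices[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# A toy signal sampled at 10 Hz with one beat per second -> 60 BPM.
signal = [0.1, 0.2, 0.9, 0.3, 0.1, 0.0, 0.1, 0.2, 0.1, 0.0] * 3
print(detect_beats(signal, 0.5))                  # → [2, 12, 22]
print(bpm(detect_beats(signal, 0.5), 10))         # → 60.0
```

Timing the light from detected beats, rather than mirroring the raw waveform, keeps the flash rhythm steady even when the sensor signal is noisy.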

Project website:

http://www.aether-hemera.com/Work/Detail/HeartBitLamp

http://vimeo.com/38769189

Gloria Ronchi | Aether & Hemera | hemera@aether-hemera.com

Claudio Benghi | Northumbria University | claudio.benghi@northumbria.ac.uk


©2013 ACM  1072-5220/13/01  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2013 ACM, Inc.
