Demo Hour

XXIII.1 January + February 2016
Page: 8

Authors:
Hayeon Jeong, Daniel Saakes, Uichin Lee, Augusto Esteves, Eduardo Velloso, Andreas Bulling, Hans Gellersen, Katsutoshi Masai, Yuta Sugiura, Masa Ogata, Kai Kunze, Masahiko Inami, Maki Sugimoto, Anura Rathnayake, Tilak Dias

1. I-Eng: A Toy for Second-Language Learning

I-Eng is an interactive toy set that aims to teach new languages to young children between the ages of three and five. The toy consists of a talking plush doll that interacts with tagged objects. The doll speaks sentences related to nearby objects and, depending on the context, can ask the child for other related objects. This allows children to practice both active and passive vocabulary. Through interaction with these tangible objects, an unscripted narrative unfolds. Children are thus naturally exposed to the foreign language and can have a playful "learning by doing" experience.
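
One way to picture the interaction loop: read the ID of the nearest tagged object, then choose between naming it and prompting the child for a related one. The Python sketch below illustrates that loop; the tag IDs, vocabulary, and 50/50 prompt choice are invented for illustration and are not details of the actual toy.

```python
import random

# Hypothetical tag-to-vocabulary mapping; real tag IDs and the word
# list would come from the toy set itself.
VOCAB = {
    "tag:apple":  ("apple",  ["banana", "orange"]),
    "tag:banana": ("banana", ["apple", "orange"]),
    "tag:orange": ("orange", ["apple", "banana"]),
}

def respond(tag_id):
    """Return what the doll says when a tagged object comes near.

    Passive vocabulary: the doll names the object it recognizes.
    Active vocabulary: it sometimes asks the child to fetch a related
    object, so the child must recall that word unprompted.
    """
    if tag_id not in VOCAB:
        return "What is that? Can you show me something else?"
    word, related = VOCAB[tag_id]
    if random.random() < 0.5:
        return f"I see a {word}! Do you like it?"
    return f"Thank you for the {word}. Can you bring me a {random.choice(related)}?"
```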

http://mid.kaist.ac.kr/projects/i_eng/

https://vimeo.com/138178841

Jeong, H., Saakes, D.P., and Lee, U. I-Eng: An interactive toy for second language learning. Adjunct Proc. of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing. ACM, New York, 2015.

Hayeon Jeong, KAIST,
[email protected]

Daniel Saakes, KAIST

Uichin Lee, KAIST

ins01.gif I-Eng consists of a talking plush doll that interacts with tangible object toys.
ins02.gif A boy plays with I-Eng. When he presents a tagged object to the talking doll, it reacts by speaking an appropriate sentence.

2. Orbits: Gaze Interaction for Smart Watches

Orbits is a gaze interaction technique that enables hands-free input on smart watches, accounting for the limited display space of these devices. The technique uses moving controls to leverage "smooth pursuit" eye movements, thus detecting which control the user is looking at. Each target performs a distinct function and can be activated by following it with the eyes, allowing for both discrete and continuous control. Because our approach relies on the relative movement of the eyes, no calibration between the eye tracker and the display is necessary.
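
The detection step can be pictured as a correlation test: over a short sliding window, the stream of gaze samples is compared against each control's known on-screen trajectory, and a control is activated only when the eyes demonstrably follow it. Below is a minimal Python sketch of that idea; the window length, the 0.8 threshold, and the rule of correlating the x and y channels independently are illustrative assumptions rather than the published system's exact parameters.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    norm_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    norm_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return cov / (norm_a * norm_b)

def match_orbit(gaze, targets, threshold=0.8):
    """Pick the moving control the user is following with their eyes.

    gaze    -- list of (x, y) eye-tracker samples over a time window
    targets -- dict mapping each control's name to its list of (x, y)
               positions over the same window
    Returns the best-matching control name, or None if no control's
    trajectory correlates with the gaze above the threshold.
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best_name, best_score = None, threshold
    for name, path in targets.items():
        tx = [p[0] for p in path]
        ty = [p[1] for p in path]
        # Both axes must track the target; taking the weaker of the two
        # correlations rejects accidental one-axis matches.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Because correlation is insensitive to offset and scale, the raw gaze samples never need to be mapped onto display coordinates, which is why no calibration between the eye tracker and the display is required.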

http://www.mysecondplace.org/orbits/

https://www.youtube.com/watch?v=x6hibicxEFbg

Esteves, A., Velloso, E., Bulling, A., and Gellersen, H. Orbits: Gaze interaction for smart watches using smooth pursuit eye movements. Proc. of the 28th Annual ACM Symposium on User Interface Software and Technology. ACM, New York, 2015.

Augusto Esteves, Lancaster University,
[email protected]

Eduardo Velloso, Lancaster University

Andreas Bulling, Max Planck Institute for Informatics

Hans Gellersen, Lancaster University

ins03.gif A user interacting with a missed-call menu on a smart watch using gaze input. The UI shows four Orbits controls that allow the user to call back, text back, store the number, or clear the missed-call notification. Gaze input is captured through a head-mounted eye tracker.

3. AffectiveWear

This eyewear system detects facial expressions. Using proximity sensing, photo-reflective sensors measure the distance between the eyewear frame and the surface of the user's face. This distance changes as the facial muscles move to create different expressions. Detecting and recording these changes helps users understand more about their unintentional nonverbal cues. For example, users suffering from depression or other mental disorders can track whether their condition is improving based on changes in their facial expressions.
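
To make the sensing pipeline concrete, here is a minimal Python sketch of one plausible classification step: each frame of photo-reflective readings is matched against labeled samples recorded during a short per-user calibration. The nearest-neighbor rule, the calibration step, and the sensor layout are assumptions for illustration; the actual system may use a different classifier.

```python
import math

def classify_expression(reading, samples, k=3):
    """Guess a facial expression from one frame of sensor readings.

    reading -- tuple of photo-reflective values, one per sensor on the
               frame (each reflects a frame-to-skin distance)
    samples -- list of (sensor_tuple, label) pairs collected while the
               user holds known expressions, e.g. "neutral", "smile"
    Returns the majority label among the k nearest calibration samples.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    nearest = sorted(samples, key=lambda s: dist(reading, s[0]))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)
```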

In addition, AffectiveWear allows users to use facial expressions to change typography in text messages, or to add emoticons. The system can also display users' facial expressions on avatars to achieve more natural, subtle communication in virtual worlds.

http://im-lab.net/affectivewear/

https://www.youtube.com/watch?v=9PMzpsDg518

Masai, K., Sugiura, Y., Ogata, M., Kunze, K., Inami, M., and Sugimoto, M. AffectiveWear: Smart eye glasses to recognize facial expressions. UbiComp 2015 demo.

Masai, K., Sugiura, Y., Ogata, M., Suzuki, K., Nakamura, F., Shimamura, S., Kunze, K., Inami, M., and Sugimoto, M. AffectiveWear: Toward recognizing facial expression. ACM SIGGRAPH 2015 Posters. ACM, New York, 2015.

Katsutoshi Masai, Keio University,
[email protected]

Yuta Sugiura, National Institute of Advanced Industrial Science and Technology (AIST)

Masa Ogata, National Institute of Advanced Industrial Science and Technology (AIST)

Kai Kunze, Keio University

Masahiko Inami, Keio University Graduate School of Media Design

Maki Sugimoto, Keio University

ins04.gif AffectiveWear glasses can detect a user's facial expressions.
ins05.gif AffectiveWear glasses.

4. Yarns with Embedded Electronics

The goal of this research was to develop the core technology for embedding semiconductor micro-devices within the fibers of yarns in order to craft novel electronically active yarn (EAY). Such smart yarns will be the building blocks of the next generation of wearable electronics. They will help solve the problems that manufacturers of wearable textiles currently face and open the door for designers to develop the next generation of truly wearable computers: more comfortable, flexible, and washable. Applications include medicine, sports science, automobiles, the military, fashion design, retail, and manufacturing.

www.facebook.com/NTUAdvancedTextiles

https://ntuadvancedtextiles.wordpress.com/

@advancedtextile

https://www.youtube.com/watch?v=PbLcpge7Hyk

Rathnayake, A. and Dias, T. Electronically active smart textiles. Research and the Researcher, Research Practice Course Fifth Annual Conference, Nottingham Trent University, 2013.

Dias, T. and Rathnayake, A. Integration of micro-electronics with yarns for smart textiles (Chapter 5). In T. Dias, ed., Electronic Textiles: Smart Fabrics and Wearable Technology. Woodhead Publishing, 2015.

Anura Rathnayake, Nottingham Trent University,
[email protected]

Tilak Dias, Nottingham Trent University

ins06.gif RFID yarn alone and integrated into garments.
ins07.gif LED yarns with applied voltage.


©2016 ACM  1072-5220/16/01  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

