AffectiveWear
Issue: XXIII.1 January + February 2016, Page 8
Authors:
Katsutoshi Masai, Yuta Sugiura, Masa Ogata, Kai Kunze, Masahiko Inami, and Maki Sugimoto
This eyewear system detects facial expressions. Using proximity sensing, photo-reflective sensors measure the distance between the eyewear frame and the surface of the user's face. This distance changes as the facial muscles move to create different expressions. Detecting and recording these changes helps users understand more about their unintentional nonverbal cues. For example, users coping with depression or other mental health conditions can track whether their state is improving based on changes in their facial expressions.
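To make the sensing idea concrete, here is a minimal Python sketch. It assumes eight photo-reflective sensor readings per frame and a per-user calibration table, and classifies the current reading with a simple nearest-centroid rule. The sensor count, values, labels, and classifier are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical setup: 8 photo-reflective sensors on the eyewear frame,
# each reporting a frame-to-skin distance reading (arbitrary analog units).
NUM_SENSORS = 8

# Per-user calibration: mean sensor vector recorded while the wearer holds
# each expression. Labels and values here are placeholders, not real data.
calibration = {
    "neutral": np.array([512, 500, 498, 505, 510, 503, 499, 507]),
    "smile":   np.array([560, 548, 530, 512, 565, 550, 528, 515]),
    "frown":   np.array([470, 465, 480, 490, 468, 462, 478, 492]),
}

def classify_expression(reading: np.ndarray) -> str:
    """Return the calibrated expression whose centroid is closest to the
    current sensor vector (nearest-centroid rule)."""
    distances = {
        label: np.linalg.norm(reading - centroid)
        for label, centroid in calibration.items()
    }
    return min(distances, key=distances.get)

# Example: one frame of sensor readings taken while the user smiles.
current = np.array([558, 545, 529, 514, 562, 551, 527, 516])
print(classify_expression(current))  # -> "smile"
```

The key point the sketch illustrates is that each expression shifts the distances between the frame and the face in a characteristic pattern, so a frame of readings can be matched against calibrated patterns.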
In addition, AffectiveWear allows users to use facial expressions to change typography in text messages, or to add emoticons. The system can also display users' facial expressions on avatars to achieve more natural, subtle communication in virtual worlds.
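As a rough illustration of the messaging application, the sketch below maps a detected expression label (as produced above) to an emoticon and a typography hint for an outgoing message. The labels, emoticons, and style names are assumptions for the example, not taken from the system itself.

```python
# Hypothetical mapping from a detected expression label to message decoration.
EMOTICONS = {"smile": ":)", "frown": ":(", "neutral": ""}
STYLES = {"smile": "bold", "frown": "italic", "neutral": "regular"}

def annotate_message(text: str, expression: str) -> dict:
    """Attach an emoticon and a typography hint based on the wearer's
    current expression."""
    return {
        "text": f"{text} {EMOTICONS.get(expression, '')}".strip(),
        "style": STYLES.get(expression, "regular"),
    }

print(annotate_message("See you soon", "smile"))
# -> {'text': 'See you soon :)', 'style': 'bold'}
```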
http://im-lab.net/affectivewear/
Masai, K., Sugiura, Y., Ogata, M., Kunze, K., Inami, M., and Sugimoto, M. AffectiveWear: Smart eye glasses to recognize facial expressions. UbiComp 2015 Demos. ACM, New York, 2015.
Masai, K., Sugiura, Y., Ogata, M., Suzuki, K., Nakamura, F., Shimamura, S., Kunze, K., Inami, M., and Sugimoto, M. AffectiveWear: Toward recognizing facial expression. ACM SIGGRAPH 2015 Posters. ACM, New York, 2015.
Katsutoshi Masai, Keio University
[email protected]
Yuta Sugiura, National Institute of Advanced Industrial Science and Technology (AIST)
Masa Ogata, National Institute of Advanced Industrial Science and Technology (AIST)
Kai Kunze, Keio University
Masahiko Inami, Keio University Graduate School of Media Design
Maki Sugimoto, Keio University