Abstract

This paper presents a haptic interface that augments human-human interpersonal interaction by delivering an interaction partner's facial expressions to an individual through a visual-to-tactile mapping of facial action units. Pancake shaftless vibration motors mounted on the back of a chair provide vibrotactile stimulation during a dyadic (one-on-one) interaction across a table. This work explores the design of spatiotemporal vibration patterns for conveying the basic building blocks of facial movement as defined by the Facial Action Coding System. A behavioral study was conducted to explore the factors that influence how naturally affect can be conveyed through vibrotactile cues.
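The abstract describes mapping facial action units (AUs) to spatiotemporal vibration patterns on a grid of chair-mounted motors. A minimal sketch of such a mapping is shown below; the motor layout, the specific AUs, the pattern shapes, and the timing values are all illustrative assumptions, not the authors' actual design.

```python
# Hypothetical sketch: mapping FACS action units (AUs) to spatiotemporal
# vibration patterns on a back-mounted grid of pancake vibration motors.
# Grid size, AU choices, pattern shapes, and timings are assumptions.

# A pattern is a sequence of frames; each frame is a list of
# (row, col, intensity) activations on a 3x3 motor grid.
AU_PATTERNS = {
    "AU12_lip_corner_puller": [          # smile: sweep outward along bottom row
        [(2, 1, 1.0)],
        [(2, 0, 0.8), (2, 2, 0.8)],
    ],
    "AU4_brow_lowerer": [                # frown: pulse top corners, then center
        [(0, 0, 0.8), (0, 2, 0.8)],
        [(0, 1, 1.0)],
    ],
}

def render_pattern(au: str, frame_ms: int = 150):
    """Expand an AU's frames into timed motor commands (time_ms, row, col, duty)."""
    commands = []
    for i, frame in enumerate(AU_PATTERNS[au]):
        for row, col, duty in frame:
            commands.append((i * frame_ms, row, col, duty))
    return commands

smile = render_pattern("AU12_lip_corner_puller")
# smile → [(0, 2, 1, 1.0), (150, 2, 0, 0.8), (150, 2, 2, 0.8)]
```

In a real system the timed command list would be streamed to motor drivers; here it simply illustrates how a discrete AU detection could index into a library of spatiotemporal patterns.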

Original language: English (US)
Title of host publication: 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 82-87
Number of pages: 6
ISBN (Electronic): 9781479959631
DOIs: 10.1109/HAVE.2014.6954336
State: Published - Nov 12 2014
Event: 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014 - Richardson, United States
Duration: Oct 10 2014 - Oct 11 2014

Publication series

Name: 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014 - Proceedings

Other

Other: 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014
Country: United States
City: Richardson
Period: 10/10/14 - 10/11/14

Keywords

  • Affective haptics
  • Assistive technology
  • Bilateral interpersonal interaction
  • Facial expressions
  • Social assistive aids
  • Vibrotactile stimulation

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction


Cite this

Bala, S., McDaniel, T., & Panchanathan, S. (2014). Visual-to-tactile mapping of facial movements for enriched social interactions. In 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014 - Proceedings (pp. 82-87). [6954336] (2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games, HAVE 2014 - Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/HAVE.2014.6954336