Abstract

In this paper, a face-tracking, camera-based assistive technology is presented for extracting head and face gestures and delivering them to individuals who are blind. The interface tracks the head and face of an interaction partner and conveys the detected gestures through the recently developed VibroGlove interface. The construction of the interface as well as its application are detailed.
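As a rough illustration of the camera-to-glove flow described above, the sketch below maps recognized expression labels to vibrotactile patterns. Everything here is a hypothetical assumption for illustration only: the expression labels, the motor layout, and the `pattern_for` helper are not taken from the paper or from the VibroGlove specification.

```python
# Hypothetical sketch: mapping recognized facial expressions to
# vibrotactile patterns on a glove. Motor count, labels, and the
# pattern encoding are illustrative assumptions, not paper details.

# Assume one vibration motor on the back of each finger,
# indexed 0 (thumb) through 4 (little finger).
EXPRESSION_PATTERNS = {
    "smile":   [(0, 0.0), (2, 0.1), (4, 0.2)],  # (motor index, onset in s)
    "frown":   [(4, 0.0), (2, 0.1), (0, 0.2)],
    "neutral": [(2, 0.0)],
}

def pattern_for(expression: str) -> list[tuple[int, float]]:
    """Return the (motor, onset) sequence for a recognized expression;
    an unrecognized label falls back to the neutral pattern."""
    return EXPRESSION_PATTERNS.get(expression, EXPRESSION_PATTERNS["neutral"])
```

In a real pipeline, the expression label would come from a face-tracking classifier on the camera feed, and the returned sequence would drive the glove's motors in time order.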

Original language: English (US)
Title of host publication: HAVE 2010 - 2010 IEEE International Symposium on Haptic Audio-Visual Environments and Games, Proceedings
Number of pages: 1
DOIs: 10.1109/HAVE.2010.5623964
State: Published - Dec 30 2010
Event: 2010 9th IEEE International Symposium on Haptic Audio-Visual Environments and Games, HAVE 2010 - Phoenix, AZ, United States
Duration: Oct 16 2010 - Oct 17 2010

Publication series

Name: HAVE 2010 - 2010 IEEE International Symposium on Haptic Audio-Visual Environments and Games, Proceedings

Other

Other: 2010 9th IEEE International Symposium on Haptic Audio-Visual Environments and Games, HAVE 2010
Country: United States
City: Phoenix, AZ
Period: 10/16/10 - 10/17/10

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction
  • Software


Cite this

Bala, S., Ramesh, V., Krishna, S., & Panchanathan, S. (2010). Dyadic interaction assistant for tracking head gestures and facial expressions. In HAVE 2010 - 2010 IEEE International Symposium on Haptic Audio-Visual Environments and Games, Proceedings [5623964] (HAVE 2010 - 2010 IEEE International Symposium on Haptic Audio-Visual Environments and Games, Proceedings). https://doi.org/10.1109/HAVE.2010.5623964