HCC: Small: Assistive Social Situational Awareness Aids for Individuals with Disabilities

Project: Research project

Description

1. Project Background

Social interactions are an essential component of both personal and professional growth. While 35% of our interpersonal interactions occur through verbal speech, nearly 65% of communication happens through non-verbal cues (such as eye gaze, body mannerisms, and hand gestures) [1] [2], which are inaccessible to individuals with visual impairments. This lack of access impedes the individual's ability to make appropriate judgments, leading to a sense of social isolation or diminished returns in any social exchange. From learning social skills to using them, individuals who are blind face fundamental social challenges in coping with society. This project addresses this fundamental and largely unexplored challenge. In particular, the project will pioneer the development of an experiential technology solution that provides real-time access to non-verbal communication cues in dyadic (one-on-one) interactions and delivers these cues to users through next-generation haptic interfaces (see Figure 1). The project is being collaboratively implemented by experts at Arizona State University (ASU) and the University of South Florida (USF) in ubiquitous and embodied multimedia computing, human-computer interfaces, haptics, and machine intelligence (PI Panchanathan, ASU); computer vision, machine intelligence, and assistive technologies (Balasubramanian, ASU); human communication studies and non-verbal communication (Ramirez, USF); usability engineering and disability studies (Hedgpeth, ASU); and a caregiving institution for individuals with visual impairments (Arizona Center for the Blind and Visually Impaired).
The research objectives of this project focus on four components: (1) a dyadic mediation interface (see Figure 2 for ongoing work); (2) a suite of algorithms for extracting and understanding non-verbal communicative cues (see Figure 3 for ongoing work); (3) a haptic delivery system that translates visual non-verbal social cues into effective haptic cues (see Figure 4 for ongoing work); and (4) real-world evaluation with the target user population. The outcomes of this project will not only have a significant impact on the lives of individuals with visual impairments, but will also provide pathways to technologies for individuals with other disabilities, such as autism, prosopagnosia, and frontotemporal dementia, and - in the most general sense - a very large portion of society.
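To make the second and third components concrete, the sketch below shows one way a recognized visual cue could be translated into a vibrotactile pattern for a multi-motor haptic interface. This is a minimal illustrative sketch under assumed names and mappings (the cue labels, motor layout, and `HapticPattern` structure are hypothetical), not the project's actual design.

```python
# Hypothetical cue-to-haptic translation step: map a recognized
# non-verbal cue (output of a vision pipeline) to a vibration pattern.
# All cue names, motor counts, and pattern values are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional


@dataclass
class HapticPattern:
    """A vibrotactile pattern: per-motor intensities (0.0-1.0) and duration in ms."""
    intensities: tuple
    duration_ms: int


# Assumed mapping from recognized cues to patterns on a 3-motor array
# (left, center, right).
CUE_TO_HAPTIC = {
    "smile":      HapticPattern((0.0, 1.0, 0.0), 200),  # center motor pulse
    "head_nod":   HapticPattern((1.0, 0.0, 1.0), 150),  # both outer motors
    "gaze_left":  HapticPattern((1.0, 0.0, 0.0), 100),  # left motor only
    "gaze_right": HapticPattern((0.0, 0.0, 1.0), 100),  # right motor only
}


def translate_cue(cue: str) -> Optional[HapticPattern]:
    """Return the haptic pattern for a recognized cue, or None if unmapped."""
    return CUE_TO_HAPTIC.get(cue)


if __name__ == "__main__":
    for cue in ("smile", "gaze_left", "shrug"):
        print(cue, "->", translate_cue(cue))
```

In a real system the lookup table would be replaced by an empirically tuned mapping, since which vibration patterns are distinguishable and intuitive is itself a research question addressed by the project's user evaluations.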
Status: Finished
Effective start/end date: 9/1/11 - 8/31/15

Funding

  • National Science Foundation (NSF): $515,284.00

Fingerprint

Communication
Usability engineering
Haptic interfaces
Computer vision
Interfaces (computer)