Development of a Responsive Emotive Sensing System (DRESS)

Project: Research project

Project Details


This R21 FOA encourages the development of novel technologies aimed at sustaining independent living for people with chronic disabling conditions or reducing the burden on care providers. Our goal is to do both: to develop a new technology for a very high-risk, high-cost population, persons with cognitive impairment (PWCI) from Alzheimer's disease, and their lay caregivers. The majority (64%) of primary family caregivers of PWCI report that daily dressing struggles are their most common source of caregiving distress, yet supportive interventions are absent. In response, we propose the Development of a Responsive Emotive Sensing System (DRESS), a computer-mediated automated application using, for the first time, context-aware affective computing for PWCI. We will innovatively integrate a means to assess one's individual stress responses through wireless skin conductance sensing. Using our wristband sensor, the emotional responsiveness of the user is transmitted to the DRESS system. DRESS will be developed to integrate the sensor data in real time to enable automated verbal and video dressing prompting that is tailored to the PWCI's unique functional capabilities and responsiveness. DRESS is designed to be retrofitted onto the user's home bureau, attending to usability and human-factors issues from the gerontechnology field to ensure practicality and ease of use. It will be built using the GaLLaG technology platform, which has established interoperability and flexibility to facilitate prototype scalability. In this pilot feasibility study, we advance from our affirming proof-of-concept study to developing and testing an alpha-version LAB model of DRESS.
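As a rough illustration of the real-time integration described above, the sketch below shows how skin-conductance readings from a wristband sensor might be mapped to an escalating prompt level (verbal cue, video demonstration, caregiver alert). All function names, thresholds, and prompt tiers here are hypothetical assumptions for illustration, not details from the DRESS design.

```python
# Hedged sketch (all names and thresholds hypothetical): mapping
# wristband skin-conductance readings to a dressing-prompt level.
from statistics import mean

CALM, ELEVATED, DISTRESSED = "verbal_prompt", "video_prompt", "caregiver_alert"

def prompt_level(samples_uS, baseline_uS, window=5):
    """Classify the wearer's arousal from recent skin conductance (microsiemens).

    A sustained rise above baseline suggests stress; the system escalates
    from a simple verbal cue to a video demonstration, and finally alerts
    the caregiver. The 0.5/1.5 uS thresholds are illustrative only.
    """
    recent = mean(samples_uS[-window:])  # smooth over the last few samples
    delta = recent - baseline_uS
    if delta < 0.5:
        return CALM
    if delta < 1.5:
        return ELEVATED
    return DISTRESSED

# Example: a user whose conductance climbs during a dressing step.
readings = [2.0, 2.1, 2.2, 2.8, 3.0, 3.1, 3.2, 3.3]
print(prompt_level(readings, baseline_uS=2.0))  # → video_prompt
```

In a deployed system, this classification would feed the tailored prompting logic rather than print a label; the point is that a single scalar sensed in real time can gate which level of support the user receives.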
Effective start/end date: 4/3/14 to 8/31/14


  • HHS-NIH: National Institute of Nursing Research (NINR): $29,962.00

