Share-N-learn: A framework for sharing activity recognition models in wearable systems with context-varying sensors

Seyed Ali Rokni, Hassan Ghasemzadeh

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Wearable sensors utilize machine learning algorithms to infer important events such as the behavioral routine and health status of their end users from time-series sensor data. A major obstacle to large-scale utilization of these systems is that the machine learning algorithms cannot be shared among users or reused in contexts different from the setting in which the training data are collected. As a result, the algorithms need to be retrained from scratch in new sensor contexts, such as when the on-body location of the wearable sensor changes or when the system is utilized by a new user. The retraining process places a significant burden on end users and system designers to collect and label large amounts of training sensor data. In this article, we challenge the current algorithm training paradigm and introduce Share-n-Learn to automatically detect and learn physical sensor contexts from a repository of shared expert models without collecting any new labeled training data. Share-n-Learn enables system designers and end users to seamlessly share and reuse machine learning algorithms that are trained under different contexts and data collection settings. We develop algorithms to autonomously identify sensor contexts and propose a gating function to automatically activate the most accurate machine learning model among the set of shared expert models. We assess the performance of Share-n-Learn for activity recognition when a dynamic sensor constantly migrates from one body location to another. Our analysis based on real data collected with human subjects on three datasets demonstrates that Share-n-Learn achieves, on average, 68.4% accuracy in detecting physical activities with context-varying wearables. This accuracy is about 19% higher than 'majority voting,' 10% higher than the state-of-the-art transfer learning, and only 8% below the experimental upper bound.
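The gating idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the `GaussianExpert` class, the `gate_select` function, and the average-log-likelihood score used as a context-match proxy are all illustrative assumptions. The sketch assumes each shared expert is a simple generative classifier trained in its own context, and the gate activates the expert whose model best fits the unlabeled data arriving from the new sensor context.

```python
# Hedged sketch (not Share-n-Learn's exact method): gating over a
# repository of shared expert models. Each expert is a toy generative
# classifier; the gate activates the expert whose model assigns the
# highest average log-likelihood to unlabeled data from the new context.
import numpy as np


class GaussianExpert:
    """Toy expert: one class-mean per activity, fitted on its own
    context's labeled data (hypothetical stand-in for a shared model)."""

    def __init__(self, X, y):
        self.classes = np.unique(y)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])

    def _dists(self, X):
        # squared distance from each sample to each class mean, shape (n, n_classes)
        return ((X[:, None, :] - self.means[None]) ** 2).sum(axis=2)

    def score(self, X):
        # average log of an unnormalized Gaussian-mixture density over X,
        # computed with the log-sum-exp trick for numerical stability
        d = self._dists(X)
        m = d.min(axis=1, keepdims=True)
        return float((-m[:, 0] + np.log(np.exp(-(d - m)).sum(axis=1))).mean())

    def predict(self, X):
        # nearest-class-mean prediction
        return self.classes[self._dists(X).argmin(axis=1)]


def gate_select(experts, X_unlabeled):
    """Gating function sketch: return the index of the expert whose model
    best explains the unlabeled data -- a proxy for context match."""
    return int(np.argmax([e.score(X_unlabeled) for e in experts]))


# Demo: two experts trained in different "contexts" (simulated here as a
# feature-space shift, e.g., a different on-body sensor location).
rng = np.random.default_rng(0)
Xa = rng.normal(0.0, 0.3, (100, 2))
ya = (Xa[:, 0] > 0).astype(int)
Xb = Xa + 5.0  # shifted context, same labels
yb = ya
experts = [GaussianExpert(Xa, ya), GaussianExpert(Xb, yb)]

X_new = rng.normal(0.0, 0.3, (50, 2))  # unlabeled data resembling context A
best = gate_select(experts, X_new)
print(best)  # → 0 (the expert trained in the matching context)
```

The density-based score is used here instead of raw prediction confidence because a classifier can remain over-confident on data far from its training context, whereas a likelihood score drops sharply under context mismatch.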

Original language: English (US)
Article number: 39
Journal: ACM Transactions on Design Automation of Electronic Systems
Volume: 24
Issue number: 4
DOIs
State: Published - 2019
Externally published: Yes

Keywords

  • Activity recognition
  • Machine learning
  • Transfer learning
  • Wearable sensors

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
  • Electrical and Electronic Engineering
