Abstract

Recent popular emphasis on exercise for personal wellbeing has created demand for techniques that monitor and classify human activities. Previous studies have shown promising results in applying various classification and feature-extraction methods to identify unique physical activities across datasets. We apply learning techniques to GENEActiv accelerometer recordings to identify and monitor a wide range of daily activities. The dataset comprises 92 participants, aged 20-65, performing 25 unique activities, both ambulatory and non-ambulatory. We extracted 130 time- and frequency-domain features and selected the most efficient ones with the sequential forward selection algorithm. Using two-stage classification with both a Gaussian mixture model (GMM) and a hidden Markov model (HMM), we combined activities with similar features, and we present a comparative study of the two classifiers. We achieved 95.5% accuracy classifying 10 unique activities with the HMM and 89.7% classifying 9. The best result is obtained with the HMM in a 2-D feature space, where it classifies 15 unique activities with 90.12% accuracy.
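The feature-selection step named in the abstract, sequential forward selection, greedily grows a feature subset by repeatedly adding the single remaining feature that most improves a score. A minimal sketch follows; the feature names, the toy scoring function, and the redundancy penalty are hypothetical stand-ins for illustration, not the paper's implementation.

```python
def sequential_forward_selection(features, score_fn, k):
    """Greedily build a subset of at most k features: at each step,
    add the remaining feature that yields the highest subset score."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best_feat, best_score = None, float("-inf")
        for f in remaining:
            s = score_fn(selected + [f])
            if s > best_score:
                best_feat, best_score = f, s
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

# Toy scoring: each feature has a fixed standalone usefulness, with a
# penalty when two partly redundant features are chosen together
# (purely illustrative; a real score would be classifier accuracy).
usefulness = {"mean": 5.0, "std": 4.0, "energy": 3.0, "entropy": 1.0}

def toy_score(subset):
    score = sum(usefulness[f] for f in subset)
    if "mean" in subset and "std" in subset:
        score -= 2.0  # assume mean and std carry overlapping information
    return score

print(sequential_forward_selection(list(usefulness), toy_score, 2))
# -> ['mean', 'energy']: std would score 7.0 after the redundancy
#    penalty, so energy (8.0) is picked second despite lower usefulness.
```

In the paper's setting, `score_fn` would wrap a cross-validated classifier evaluated on the candidate feature subset, and `k` would be tuned rather than fixed.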

Original language: English (US)
Title of host publication: BSN 2016 - 13th Annual Body Sensor Networks Conference
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 359-364
Number of pages: 6
ISBN (Electronic): 9781509030873
DOIs
State: Published - Jul 18 2016
Event: 13th Annual Body Sensor Networks Conference, BSN 2016 - San Francisco, United States
Duration: Jun 14 2016 - Jun 17 2016

Other

Other: 13th Annual Body Sensor Networks Conference, BSN 2016
Country/Territory: United States
City: San Francisco
Period: 6/14/16 - 6/17/16

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Instrumentation
  • Biomedical Engineering

Title: 'Learning approach for classification of GENEActiv accelerometer data for unique activity identification'