TY - JOUR
T1 - A Two-Dimensional Feature Space-Based Approach for Human Locomotion Recognition
AU - Chinimilli, Prudhvi Tej
AU - Redkar, Sangram
AU - Sugar, Thomas
N1 - Funding Information:
Manuscript received November 3, 2018; revised January 11, 2019; accepted January 12, 2019. Date of publication January 25, 2019; date of current version May 6, 2019. This work was partially supported by funding from the DoD RIF program “ExoSense: Sensing and Control Suite / Suit” (Contract No. W911NF-17-C-0044, NextGen Aeronautics (prime) and Arizona State University (sub)). The associate editor coordinating the review of this paper and approving it for publication was Dr. Edward Sazonov. (Corresponding author: Sangram Redkar.) The authors are with the Polytechnic School, Ira A. Fulton Schools of Engineering, Arizona State University, Mesa, AZ 85212 USA (e-mail: pchinimi@asu.edu; sredkar@asu.edu; thomas.sugar@asu.edu). Digital Object Identifier 10.1109/JSEN.2019.2895289
Funding Information:
ACKNOWLEDGMENT The authors acknowledge Dr. Jayanth Kudva, NextGen Aeronautics for his assistance. The support from TPOCs Dr. Stephanie McElhinny (ARO), and CDR Jefferson Grubb (USSOCOM SOF ATL) is gratefully acknowledged.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/6/1
Y1 - 2019/6/1
N2 - Current state-of-the-art methods utilize multiple sensors and derive numerous features to perform human locomotion recognition. The use of multiple sensors causes discomfort to users, and deriving numerous features from multiple sensors poses real-time processing issues. This paper presents a real-time human locomotion recognition system that uses a single inertial measurement unit placed on the human thigh to capture thigh angular data. Two features, amplitude (A) and omega (ω), are derived from an adaptive time window of the thigh angular data; the adaptive time window is built from two consecutive peaks in the thigh angle signal. Machine learning algorithms, namely the linear support vector machine and k-nearest neighbors, are employed on the A-ω feature space to perform the classification. A total of seven periodic activities are considered in this paper: level walking, stair ascent, stair descent, uphill walking, downhill walking, jogging, and running. Experiments are performed on six healthy subjects in controlled (motion capture laboratory) and uncontrolled (outdoor) environmental conditions to evaluate the efficacy of the algorithm. The A-ω feature-based method achieved classification accuracies of 98.1% and 93.3% for the controlled and uncontrolled experiments, respectively, and reported a fast mean activity recognition time of 80 ms.
AB - Current state-of-the-art methods utilize multiple sensors and derive numerous features to perform human locomotion recognition. The use of multiple sensors causes discomfort to users, and deriving numerous features from multiple sensors poses real-time processing issues. This paper presents a real-time human locomotion recognition system that uses a single inertial measurement unit placed on the human thigh to capture thigh angular data. Two features, amplitude (A) and omega (ω), are derived from an adaptive time window of the thigh angular data; the adaptive time window is built from two consecutive peaks in the thigh angle signal. Machine learning algorithms, namely the linear support vector machine and k-nearest neighbors, are employed on the A-ω feature space to perform the classification. A total of seven periodic activities are considered in this paper: level walking, stair ascent, stair descent, uphill walking, downhill walking, jogging, and running. Experiments are performed on six healthy subjects in controlled (motion capture laboratory) and uncontrolled (outdoor) environmental conditions to evaluate the efficacy of the algorithm. The A-ω feature-based method achieved classification accuracies of 98.1% and 93.3% for the controlled and uncontrolled experiments, respectively, and reported a fast mean activity recognition time of 80 ms.
KW - Human activity classification
KW - gait recognition
KW - patient monitoring
KW - wearable robots
KW - wearable sensors
UR - http://www.scopus.com/inward/record.url?scp=85065450860&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85065450860&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2019.2895289
DO - 10.1109/JSEN.2019.2895289
M3 - Article
AN - SCOPUS:85065450860
SN - 1530-437X
VL - 19
SP - 4271
EP - 4282
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 11
M1 - 8626125
ER -