Current state-of-the-art methods for human locomotion recognition rely on multiple sensors and derive numerous features from them. The use of multiple sensors causes discomfort to users, and deriving numerous features from multiple sensors poses real-time processing challenges. This paper presents a real-time human locomotion recognition system that uses a single inertial measurement unit placed on the human thigh to capture thigh angular data. Two features, amplitude (A) and omega (ω), are derived from an adaptive time window of the thigh angular data; the adaptive window is built from two consecutive peaks in the thigh angle signal. Machine learning algorithms, namely the linear support vector machine and k-nearest neighbors, are applied to the A-ω feature space to perform the classification. The periodic activities considered in this paper are level walking, stair ascent, stair descent, uphill walking, downhill walking, jogging, and running. Experiments are performed on six healthy subjects in controlled (motion capture laboratory) and uncontrolled (outdoor) environmental conditions to evaluate the efficacy of the algorithm. The A-ω feature-based method achieved classification accuracies of 98.1% and 93.3% for the controlled and uncontrolled experiments, respectively, and reported a fast mean activity recognition time of 80 ms.
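The feature-extraction step described above can be sketched in a few lines. This is a minimal illustration only, with assumptions the abstract does not specify: the adaptive window is taken to span two consecutive peaks of the thigh angle signal, A is taken as half the peak-to-trough range within that window, and ω as 2π divided by the peak-to-peak period. The function name `extract_a_omega` and these definitions are hypothetical, not the authors' exact method.

```python
import numpy as np
from scipy.signal import find_peaks

def extract_a_omega(thigh_angle, fs):
    """Sketch: derive amplitude (A) and angular frequency (omega)
    from one adaptive window of a thigh-angle signal.

    thigh_angle : 1-D array of thigh angle samples (degrees or radians)
    fs          : sampling frequency in Hz

    Assumed definitions (not given in the abstract):
      - the window spans two consecutive peaks of the signal,
      - A     = half the peak-to-trough range inside the window,
      - omega = 2*pi / T, where T is the peak-to-peak period in seconds.
    """
    peaks, _ = find_peaks(thigh_angle)
    if len(peaks) < 2:
        return None  # not enough peaks to form an adaptive window
    start, end = peaks[0], peaks[1]
    window = thigh_angle[start:end + 1]
    A = (window.max() - window.min()) / 2.0
    T = (end - start) / fs          # window length in seconds
    omega = 2.0 * np.pi / T
    return A, omega
```

The resulting (A, ω) pairs would then form the two-dimensional feature space on which a linear SVM or k-nearest-neighbors classifier (e.g. scikit-learn's `SVC(kernel="linear")` or `KNeighborsClassifier`) can be trained.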
- Gait recognition
- Human activity classification
- Patient monitoring
- Wearable robots
- Wearable sensors
ASJC Scopus subject areas
- Electrical and Electronic Engineering