A comparison of techniques for sign language alphabet recognition using armband wearables

Prajwal Paudyal, Junghyo Lee, Ayan Banerjee, Sandeep Gupta

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Recent research has shown that reliable recognition of sign language words and phrases using user-friendly, noninvasive armbands is feasible and desirable. This work provides an analysis and implementation of adding fingerspelling recognition (FR) to such systems, a much harder problem due to the lack of distinctive hand movements. A novel algorithm called DyFAV (Dynamic Feature Selection and Voting) is proposed for this purpose; it exploits the fact that fingerspelling has a finite corpus (26 letters for American Sign Language (ASL)). A detailed analysis of the algorithm is provided, along with comparisons against traditional machine-learning algorithms. The system uses an independent multiple-agent voting approach to identify letters with high accuracy. Because the agents vote independently, the algorithm is highly parallelizable, so recognition times can be kept low enough for real-time mobile applications. A thorough explanation and analysis of results obtained on the ASL alphabet corpus for nine people with limited training is presented. An average recognition accuracy of 95.36% is reported and compared with recognition results from other machine-learning techniques. This result is extended with six additional validation users whose data were collected under settings similar to the original dataset. Furthermore, a feature-selection scheme that uses a subset of the sensors is proposed and evaluated. The mobile, noninvasive, and real-time nature of the technology is demonstrated by evaluating performance on various Android phones and remote server configurations. A brief discussion of the user interface is provided, along with guidelines for best practices.
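
The paper's actual DyFAV implementation is not reproduced on this page; the Python fragment below is only a minimal sketch of the general idea the abstract describes, in which each feature acts as an independent agent that votes for one of the 26 ASL letters and the votes are then tallied. The per-feature agents, the nearest-reference voting rule, the eight-feature layout, and all numeric values are illustrative assumptions, not the authors' method.

# Illustrative sketch only; not the authors' DyFAV implementation.
# Assumption: each agent owns one feature (e.g., a statistic of one armband
# sensor channel), compares the query value against per-letter reference
# values, and casts one independent vote. Reference values here are dummies.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from string import ascii_uppercase

# references[letter][feature_index] = reference value (would come from training data)
references = {letter: [float(i + j) for j in range(8)]
              for i, letter in enumerate(ascii_uppercase)}

def agent_vote(feature_index, query_value):
    """One agent votes for the letter whose reference value is closest on its own feature."""
    return min(references,
               key=lambda letter: abs(references[letter][feature_index] - query_value))

def recognize(query_features):
    """Run every agent independently (hence parallelizable) and return the majority-vote letter."""
    with ThreadPoolExecutor() as pool:
        votes = list(pool.map(agent_vote, range(len(query_features)), query_features))
    return Counter(votes).most_common(1)[0][0]

print(recognize([2.1 + j for j in range(8)]))  # features near the 'C' references -> 'C'

Because each agent consults only its own feature, all agents can run concurrently; this independence is what the abstract credits for keeping recognition times low enough for real-time mobile use.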

Original language: English (US)
Article number: 14
Journal: ACM Transactions on Interactive Intelligent Systems
ISSN: 2160-6455
Publisher: Association for Computing Machinery (ACM)
Volume: 9
Issue number: 2-3
DOIs: https://doi.org/10.1145/3150974
State: Published - Apr 1 2019
Link to publication in Scopus: http://www.scopus.com/inward/record.url?scp=85065186430&partnerID=8YFLogxK

Fingerprint

Learning systems
Feature extraction
Learning algorithms
User interfaces
Servers
Sensors

Keywords

  • Activity recognition
  • Digital signal processing
  • Machine learning
  • Sign language recognition
  • Wearables

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Artificial Intelligence

Cite this

A comparison of techniques for sign language alphabet recognition using armband wearables. / Paudyal, Prajwal; Lee, Junghyo; Banerjee, Ayan; Gupta, Sandeep.

In: ACM Transactions on Interactive Intelligent Systems, Vol. 9, No. 2-3, 14, 01.04.2019.

Research output: Contribution to journal › Article
