TY - GEN
T1 - Generation of movement explanations for testing gesture based co-operative learning applications
AU - Banerjee, Ayan
AU - Lamrani, Imane
AU - Paudyal, Prajwal
AU - Gupta, Sandeep
N1 - Funding Information:
The authors acknowledge the contributions from the Arizona Commission for the Deaf and the Hard of Hearing (ACDHH) for helping us get access to the deaf community, and NSF grant IIS 1116385.
PY - 2019/5/17
Y1 - 2019/5/17
N2 - The paper proposes an explanation framework for machine learning based gesture recognition systems to increase trust and to provide users an interface for asking questions about the recognition result. Gestures have three components: a) handshape, b) location, and c) movement. Several techniques exist for explaining handshape and location recognition, but very limited analysis exists on explaining movement. The challenge is that modeling the movement between handshapes in a gesture requires dynamic modeling of arm kinematics using differential equations. The arm models can vary in complexity, but many of them may not be explainable. Our approach in this paper is to mine hybrid system models of gestures using a coalition of handshape recognition technology and explainable kinematic models. The hybrid dynamical systems are mined from video data collected from users. Change in the dynamics of a test user is expressed using the parameters of the kinematic equations. The parameters are converted into human-understandable explanations by experts in movement analysis. The novel outcome is the combination of fault detection in hybrid dynamical systems and machine learning to provide explanations for the recognition of continuous events. We have applied our technique to 60 users for 20 ASL gestures. Results show that the mined parameters of the kinematic equations can represent each gesture with a precision of 83%, recall of 80%, and accuracy of 82%.
AB - The paper proposes an explanation framework for machine learning based gesture recognition systems to increase trust and to provide users an interface for asking questions about the recognition result. Gestures have three components: a) handshape, b) location, and c) movement. Several techniques exist for explaining handshape and location recognition, but very limited analysis exists on explaining movement. The challenge is that modeling the movement between handshapes in a gesture requires dynamic modeling of arm kinematics using differential equations. The arm models can vary in complexity, but many of them may not be explainable. Our approach in this paper is to mine hybrid system models of gestures using a coalition of handshape recognition technology and explainable kinematic models. The hybrid dynamical systems are mined from video data collected from users. Change in the dynamics of a test user is expressed using the parameters of the kinematic equations. The parameters are converted into human-understandable explanations by experts in movement analysis. The novel outcome is the combination of fault detection in hybrid dynamical systems and machine learning to provide explanations for the recognition of continuous events. We have applied our technique to 60 users for 20 ASL gestures. Results show that the mined parameters of the kinematic equations can represent each gesture with a precision of 83%, recall of 80%, and accuracy of 82%.
KW - Artificial intelligence
KW - Explainable AI
KW - Gesture recognition
KW - Testing
UR - http://www.scopus.com/inward/record.url?scp=85067127184&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85067127184&partnerID=8YFLogxK
U2 - 10.1109/AITest.2019.00-15
DO - 10.1109/AITest.2019.00-15
M3 - Conference contribution
AN - SCOPUS:85067127184
T3 - Proceedings - 2019 IEEE International Conference on Artificial Intelligence Testing, AITest 2019
SP - 9
EP - 16
BT - Proceedings - 2019 IEEE International Conference on Artificial Intelligence Testing, AITest 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 1st IEEE International Conference on Artificial Intelligence Testing, AITest 2019
Y2 - 4 April 2019 through 9 April 2019
ER -