TY - JOUR
T1 - Emote aloud during learning with AutoTutor
T2 - Applying the Facial Action Coding System to cognitive-affective states during learning
AU - Craig, Scotty D.
AU - D'Mello, Sidney
AU - Witherspoon, Amy
AU - Graesser, Art
N1 - Funding Information:
Correspondence should be addressed to: Art Graesser, 202 Psychology Building, The University of Memphis, Memphis, TN 38152, USA. E-mail: a-graesser@memphis.edu This research was supported by the National Science Foundation (REC 0106965, ITR 0325428, REC 0633918) and the Tutoring Research Group (visit http://www.autotutor.org). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF.
PY - 2008
Y1 - 2008
AB - In an attempt to discover the facial action units for affective states that occur during complex learning, this study adopted an emote-aloud procedure in which participants were recorded as they verbalised their affective states while interacting with an intelligent tutoring system (AutoTutor). Participants' facial expressions were coded by two expert raters using Ekman's Facial Action Coding System and analysed using association rule mining techniques. The two expert raters received an overall kappa that ranged between .76 and .84. The association rule mining analysis uncovered facial actions associated with confusion, frustration, and boredom. We discuss these rules and the prospects of enhancing AutoTutor with non-intrusive affect-sensitive capabilities.
UR - http://www.scopus.com/inward/record.url?scp=47249086892&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=47249086892&partnerID=8YFLogxK
U2 - 10.1080/02699930701516759
DO - 10.1080/02699930701516759
M3 - Article
AN - SCOPUS:47249086892
SN - 0269-9931
VL - 22
SP - 777
EP - 788
JO - Cognition and Emotion
JF - Cognition and Emotion
IS - 5
ER -