Emote aloud during learning with AutoTutor: Applying the Facial Action Coding System to cognitive-affective states during learning

Scotty D. Craig, Sidney D'Mello, Amy Witherspoon, Art Graesser

Research output: Contribution to journal › Article › peer-review

118 Scopus citations

Abstract

In an attempt to discover the facial action units for affective states that occur during complex learning, this study adopted an emote-aloud procedure in which participants were recorded as they verbalised their affective states while interacting with an intelligent tutoring system (AutoTutor). Participants' facial expressions were coded by two expert raters using Ekman's Facial Action Coding System and analysed using association rule mining techniques. The two expert raters achieved an overall kappa ranging between .76 and .84. The association rule mining analysis uncovered facial actions associated with confusion, frustration, and boredom. We discuss these rules and the prospects of enhancing AutoTutor with non-intrusive affect-sensitive capabilities.
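The abstract reports inter-rater reliability as a kappa between .76 and .84. As a minimal sketch of how such a chance-corrected agreement statistic (Cohen's kappa) is computed from two raters' labels (the function name and toy label sequences below are illustrative, not taken from the paper):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's label marginals.
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[label] * cb[label] for label in ca) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy example: two raters coding four clips as "confusion" (1) or "boredom" (0).
print(cohen_kappa([0, 0, 1, 1], [0, 1, 1, 1]))  # 0.5
```

Here observed agreement is .75 but expected chance agreement is .5, so kappa is .5; values in the paper's reported range (.76-.84) indicate substantially better-than-chance coding consistency.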

Original language: English (US)
Pages (from-to): 777-788
Number of pages: 12
Journal: Cognition and Emotion
Volume: 22
Issue number: 5
State: Published - 2008
Externally published: Yes

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)

