TY - JOUR
T1 - The relationship between affective states and dialog patterns during interactions with AutoTutor
AU - Graesser, Arthur C.
AU - D'Mello, Sidney K.
AU - Craig, Scotty D.
AU - Witherspoon, Amy
AU - Sullins, Jeremiah
AU - McDaniel, Bethany
AU - Gholson, Barry
PY - 2008
Y1 - 2008
N2 - Relations between emotions (affect states) and learning have recently been explored in the context of AutoTutor. AutoTutor is a tutoring system on the Internet that helps learners construct answers to difficult questions by interacting with them in natural language. AutoTutor has an animated conversation agent and a dialog management facility that attempts to comprehend the learner's contributions and to respond with appropriate dialog moves (such as short feedback, pumps, hints, prompts for information, assertions, answers to student questions, suggestions for actions, summaries). Our long-term goal is to build an adaptive AutoTutor that responds to the learners' affect states in addition to their cognitive states. The present study adopted an emote-aloud procedure in which participants were videotaped as they verbalized their affective states (called emotes) while interacting with AutoTutor on the subject matter of computer literacy. The emote-aloud protocols uncovered a number of affective states (notably confusion, frustration, and eureka/delight). The AutoTutor log files were mined to identify characteristics of the dialog and the learners' knowledge states that were correlated with these affect states. We report the significant correlations and speculate on their implications for the larger project of building a nonintrusive, affect-sensitive AutoTutor.
UR - http://www.scopus.com/inward/record.url?scp=46049120633&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=46049120633&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:46049120633
SN - 1093-023X
VL - 19
SP - 293
EP - 312
JO - Journal of Interactive Learning Research
JF - Journal of Interactive Learning Research
IS - 2
ER -