Using the Tablet Gestures and Speech of Pairs of Students to Classify Their Collaboration

Sree Aurovindh Viswanathan, Kurt VanLehn

Research output: Contribution to journal › Article

6 Scopus citations

Abstract

Effective collaboration between student peers is not spontaneous. A system that can measure collaboration in real time may be useful, as it could alert an instructor to pairs that need help in collaborating effectively. We tested whether superficial measures of speech and user-interface actions would suffice for measuring collaboration. Pairs of students solved complex math problems while data were collected in the form of verbal interaction and user action logs from the students' tablets. We distinguished four classifications of interactivity: collaboration, cooperation, high asymmetric contribution, and low asymmetric contribution. Human coders used richer data (several video streams) to choose one of these codes for each episode. Thousands of features were extracted computationally from the log and audio data. Machine learning was used to induce a detector that also assigned a code to each episode as a function of these features. Detectors for combinations of codes were induced as well. The best detector's overall accuracy was 96 percent (kappa = 0.92) compared to human coding. This high level of agreement suggests that superficial features of speech and log data do suffice for measuring collaboration. However, these results should be viewed as preliminary because the particular task may have made it relatively easy to distinguish collaboration from cooperation.
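The abstract reports agreement between the machine detector and human coders as Cohen's kappa, which corrects raw accuracy for the agreement expected by chance given each coder's label distribution. A minimal sketch of that statistic, using hypothetical episode codes (the four interactivity labels from the abstract; the actual data and detector are not reproduced here):

```python
from collections import Counter

def cohens_kappa(human, machine):
    """Chance-corrected agreement between two label sequences.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected assumes the two coders label independently with
    their own marginal label frequencies.
    """
    assert len(human) == len(machine) and len(human) > 0
    n = len(human)
    # Proportion of episodes where both coders assigned the same code.
    p_observed = sum(h == m for h, m in zip(human, machine)) / n
    # Marginal label frequencies for each coder.
    h_counts, m_counts = Counter(human), Counter(machine)
    labels = set(human) | set(machine)
    # Chance agreement: sum over labels of the product of marginals.
    p_expected = sum((h_counts[l] / n) * (m_counts[l] / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)

# Toy example with the paper's four codes (made-up episode labels).
human   = ["collab", "collab", "coop", "high_asym", "low_asym", "coop"]
machine = ["collab", "collab", "coop", "high_asym", "low_asym", "collab"]
print(round(cohens_kappa(human, machine), 3))
```

A kappa of 0.92 on the paper's data therefore indicates near-perfect agreement well beyond chance, which is stronger evidence than the 96 percent raw accuracy alone, since chance agreement can be high when one code dominates.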

Original language: English (US)
Pages (from-to): 230-242
Number of pages: 13
Journal: IEEE Transactions on Learning Technologies
Volume: 11
Issue number: 2
DOIs
State: Published - Apr 1 2018

Keywords

  • Collaborative learning
  • educational data mining
  • learning analytics
  • machine learning

ASJC Scopus subject areas

  • Education
  • Engineering(all)
  • Computer Science Applications
