Modeling team-level multimodal dynamics during multiparty collaboration

Lucca Eloy, Angela E.B. Stewart, Mary J. Amon, Caroline Reindhardt, Amanda Michaels, Chen Sun, Valerie Shute, Nicholas D. Duran, Sidney K. D'Mello

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

1 Scopus citation

Abstract

We adopt a multimodal approach to investigating team interactions in the context of remote collaborative problem solving (CPS). Our goal is to understand the multimodal patterns that emerge and their relation to collaborative outcomes. We measured speech rate, body movement, and galvanic skin response from 101 triads (303 participants) who used video conferencing software to collaboratively solve challenging levels in an educational physics game. We use multi-dimensional recurrence quantification analysis (MdRQA) to quantify team-level regularity, or repeated patterns of activity across these three modalities. We found that teams exhibit regularity significantly above chance baselines. Regularity was unaffected by task factors but had a quadratic relationship with session time: it initially increased and then decreased as the session progressed. Importantly, teams that produced more varied behavioral patterns (irregularity) reported higher emotional valence and performed better on a subset of the problem solving tasks. Regularity did not predict arousal or subjective perceptions of the collaboration. We discuss implications of our findings for the design of systems that aim to improve collaborative outcomes by monitoring the ongoing collaboration and intervening accordingly.
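The core measure described in the abstract, team-level regularity via MdRQA, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name `mdrqa`, the fixed `radius` threshold, and the minimum diagonal line length of 2 are assumptions for illustration. The idea is to treat each time point's three modality values (speech rate, body movement, galvanic skin response) as one multidimensional state, mark pairs of time points whose states fall within a distance threshold as "recurrent," and summarize how many recurrences there are (%REC) and how many fall on diagonal line structures, i.e. repeated multi-step patterns (%DET).

```python
import numpy as np

def mdrqa(signals, radius):
    """Minimal MdRQA sketch (illustrative, not the paper's implementation).

    signals: (T, d) array, one column per modality (e.g. speech rate,
             body movement, skin response), ideally z-scored beforehand.
    radius:  distance threshold defining a "recurrence".
    Returns (recurrence rate, determinism with minimum line length 2).
    """
    X = np.asarray(signals, dtype=float)
    T = X.shape[0]
    # Pairwise Euclidean distances between multidimensional states
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    R = (dists < radius).astype(int)
    np.fill_diagonal(R, 0)  # exclude trivial self-recurrence
    n_rec = R.sum()
    rec_rate = n_rec / (T * (T - 1))
    # Determinism: share of recurrent points on diagonal lines of length >= 2
    det_points = 0
    for k in range(1, T):  # upper-triangle diagonals (R is symmetric)
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in np.append(diag, 0):  # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= 2:
                    det_points += run
                run = 0
    det = (2 * det_points / n_rec) if n_rec else 0.0
    return rec_rate, det
```

A perfectly repetitive team signal yields recurrence rate and determinism near 1, while irregular (varied) behavior, which the study links to higher valence and better performance on some tasks, yields lower values.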

Original language: English (US)
Title of host publication: ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction
Editors: Wen Gao, Helen Mei Ling Meng, Matthew Turk, Susan R. Fussell, Bjorn Schuller, Yale Song, Kai Yu
Publisher: Association for Computing Machinery, Inc
Pages: 244-258
Number of pages: 15
ISBN (Electronic): 9781450368605
DOIs: https://doi.org/10.1145/3340555.3353748
State: Published - Oct 14 2019
Externally published: Yes
Event: 21st ACM International Conference on Multimodal Interaction, ICMI 2019 - Suzhou, China
Duration: Oct 14 2019 - Oct 18 2019

Publication series

Name: ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction

Conference

Conference: 21st ACM International Conference on Multimodal Interaction, ICMI 2019
Country: China
City: Suzhou
Period: 10/14/19 - 10/18/19

Keywords

  • Behavioral Regularity
  • Collaborative Problem Solving
  • MdRQA
  • Multimodal interaction

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Human-Computer Interaction


  • Cite this

    Eloy, L., Stewart, A. E. B., Amon, M. J., Reindhardt, C., Michaels, A., Sun, C., Shute, V., Duran, N. D., & D'Mello, S. K. (2019). Modeling team-level multimodal dynamics during multiparty collaboration. In W. Gao, H. M. L. Meng, M. Turk, S. R. Fussell, B. Schuller, Y. Song, & K. Yu (Eds.), ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction (pp. 244-258). Association for Computing Machinery, Inc. https://doi.org/10.1145/3340555.3353748