TY - GEN
T1 - Assessing Multimodal Dynamics in Multi-Party Collaborative Interactions with Multi-Level Vector Autoregression
AU - Moulder, Robert G.
AU - Duran, Nicholas D.
AU - D'Mello, Sidney K.
N1 - Funding Information:
This research was supported by the NSF National AI Institute for Student-AI Teaming (iSAT) (DRL 2019805) and NSF DUE 1745442/1660877. The opinions expressed are those of the authors and do not represent the views of the funding agency.
Publisher Copyright:
© 2022 Owner/Author.
PY - 2022/11/7
Y1 - 2022/11/7
N2 - Multi-level vector autoregression (mlVAR) is a recently developed dynamic network model for assessing multimodal temporal data streams derived from multiple users over time. Importantly, mlVAR facilitates investigations into highly complex collaborative interactions within a unified framework. To demonstrate the utility of mlVAR for understanding the temporal dynamics of multimodal multi-party (MMP) interactions, we apply it to 9 signals measured from 201 users (67 triads) who engaged in a 15-minute collaborative problem-solving task. The measured signals reflect participants' affective states (positive valence and negative valence), physiological states (skin conductance and heart rate), attention (gaze fixation duration and gaze dispersion), nonverbal communication (head acceleration and facial expressiveness), and verbal communication (speech rate). Using the node-level metrics of in-strength, out-strength, and synchrony, we show that mlVAR is capable of teasing apart complex role-based dynamics (controller, primary contributor, or secondary contributor) among participants. Our findings also provide evidence for a complex feedback system between individuals in which internal states (e.g., skin conductance) are influenced by external signals of shared attention and communication (e.g., gaze and speech).
KW - collaborative problem solving
KW - network analysis
KW - time series
UR - http://www.scopus.com/inward/record.url?scp=85142815696&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85142815696&partnerID=8YFLogxK
DO - 10.1145/3536221.3556595
M3 - Conference contribution
AN - SCOPUS:85142815696
T3 - ACM International Conference Proceeding Series
SP - 615
EP - 625
BT - ICMI 2022 - Proceedings of the 2022 International Conference on Multimodal Interaction
PB - Association for Computing Machinery
T2 - 24th ACM International Conference on Multimodal Interaction, ICMI 2022
Y2 - 7 November 2022 through 11 November 2022
ER -