A new gestural control paradigm for musical expression: Real-time conducting analysis via temporal expectancy models

Dilip Swaminathan, Harvey Thornburg, Todd Ingalls, Jodi James, Stjepan Rajko, Kathleya Afanador

Research output: Contribution to conference › Paper › peer-review

Abstract

Most event sequences in everyday human movement exhibit temporal structure: for instance, footsteps in walking, the striking of balls in a tennis match, the movements of a dancer set to rhythmic music, and the gestures of an orchestra conductor. These events generate prior expectancies regarding the occurrence of future events. Moreover, these expectancies play a critical role in conveying expressive qualities and communicative intent through the movement; thus they are of considerable interest in expressive musical control contexts. To this end, we introduce a novel gestural control paradigm for musical expression based on temporal expectancies induced by human movement via a general Bayesian framework called the temporal expectancy network. We realize this paradigm in the form of a conducting analysis tool which infers beat, tempo, and articulation jointly with temporal expectancies regarding beat (ictus and preparation instances) from conducting gesture. Our system operates on data obtained from a marker-based motion capture system, but can be easily adapted for more affordable technologies combining video cameras and inertial sensors. Using our analysis framework, we observe a significant effect on the patterns of temporal expectancies generated through varying expressive qualities of the gesture (e.g., staccato vs. legato articulation), which at least partially confirms the role of temporal expectancies in musical expression.
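The abstract describes joint inference of beat, tempo, and temporal expectancies about upcoming beat events; the sketch below is a deliberately simplified, hypothetical illustration of that general idea, not the paper's temporal expectancy network. It assumes ictus times have already been detected from the gesture data and tracks the beat period with a Kalman-style update, reporting an expectancy (predicted time and uncertainty) for each next ictus. The function name track_beats and all parameter values are illustrative assumptions.

import numpy as np

# Hypothetical sketch: track beat period from observed ictus times and
# emit a temporal expectancy (predicted time, standard deviation) for the
# next ictus. Not the authors' model; a minimal Bayesian-filter analogue.

def track_beats(ictus_times, period0=0.5, period_var0=0.01,
                process_var=1e-4, obs_var=1e-3):
    period, period_var = period0, period_var0
    expectancies = []
    for prev, cur in zip(ictus_times[:-1], ictus_times[1:]):
        # Predict: expectancy for the current ictus, given the previous one.
        expected = prev + period
        expect_var = period_var + process_var + obs_var
        expectancies.append((expected, np.sqrt(expect_var)))
        # Update: treat the observed inter-onset interval as a noisy
        # measurement of the beat period (scalar Kalman gain).
        ioi = cur - prev
        k = (period_var + process_var) / (period_var + process_var + obs_var)
        period = period + k * (ioi - period)
        period_var = (1 - k) * (period_var + process_var)
    return period, expectancies

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_period = 0.5  # 120 BPM
    times = np.cumsum(true_period + 0.02 * rng.standard_normal(16))
    period, exp = track_beats(times)
    print(f"estimated beat period: {period:.3f} s")
    for (mu, sd), t in zip(exp, times[1:]):
        print(f"expected ictus at {mu:.3f} +/- {sd:.3f} s, observed {t:.3f} s")

In this toy setting, the widening or narrowing of the expectancy variance plays the role the abstract assigns to temporal expectancies: articulation styles that perturb ictus timing differently would leave different expectancy patterns.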

Original language: English (US)
Pages: 348-355
Number of pages: 8
State: Published - 2007
Event: International Computer Music Conference, ICMC 2007 - Copenhagen, Denmark
Duration: Aug 27, 2007 - Aug 31, 2007

Conference

Conference: International Computer Music Conference, ICMC 2007
Country/Territory: Denmark
City: Copenhagen
Period: 8/27/07 - 8/31/07

ASJC Scopus subject areas

  • Media Technology
  • Computer Science Applications
  • Music
