Abstract

Affective computing researchers have recently been focusing on continuous emotion dimensions such as arousal and valence. This two-dimensional affect space can account for many of the discrete emotions such as sadness, anger, and joy. In continuous emotion recognition, Principal Component Analysis (PCA) models are generally used to enhance the performance of various image and audio features by projecting them to a new space in which the features are less correlated. We instead propose that quantizing the features and projecting them to a latent topic space performs better than PCA. Specifically, we extract these topic features using Latent Dirichlet Allocation (LDA) models. We show that topic models project the original features to a latent feature space that is more coherent and more useful for continuous emotion recognition than the PCA space. Unlike PCA, where no semantics can be attributed to the new features, topic features admit a visual and semantic interpretation that can be exploited in personalized HCI applications and assistive technologies. Our hypothesis is validated using the AVEC 2012 continuous emotion challenge dataset.
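
The sketch below illustrates the general idea contrasted in the abstract, not the authors' implementation: raw frame-level descriptors are decorrelated with PCA, while the topic route first quantizes frames into a codebook "vocabulary", builds per-segment word counts, and infers LDA topic proportions. The descriptor dimensionality, codebook size, segment length, and topic count are illustrative assumptions.

```python
# Minimal sketch (assumed parameters) of PCA projection vs. LDA topic features
# over quantized audio/visual descriptors, using scikit-learn.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA, LatentDirichletAllocation

rng = np.random.default_rng(0)
frames = rng.normal(size=(5000, 64))           # placeholder frame-level descriptors

# PCA baseline: decorrelate the continuous features directly.
pca_features = PCA(n_components=20).fit_transform(frames)

# Topic-feature route: quantize frames into a discrete vocabulary,
# accumulate per-segment word-count histograms, then infer topic proportions.
codebook = KMeans(n_clusters=100, random_state=0).fit(frames)
words = codebook.predict(frames)

segment_len = 50                                # frames per segment (assumption)
n_segments = len(words) // segment_len
counts = np.zeros((n_segments, 100), dtype=int)
for s in range(n_segments):
    seg = words[s * segment_len:(s + 1) * segment_len]
    np.add.at(counts[s], seg, 1)                # histogram of codewords in segment

lda = LatentDirichletAllocation(n_components=20, random_state=0)
topic_features = lda.fit_transform(counts)      # segment-level topic proportions

print(pca_features.shape, topic_features.shape)
```

Either feature set could then be fed to a regressor that predicts arousal and valence traces; the topic proportions additionally allow inspection of which codewords dominate each latent topic.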

Original language: English (US)
Title of host publication: MM 2014 - Proceedings of the 2014 ACM Conference on Multimedia
Publisher: Association for Computing Machinery
Pages: 881-884
Number of pages: 4
ISBN (Electronic): 9781450330633
State: Published - Nov 3 2014
Event: 2014 ACM Conference on Multimedia, MM 2014 - Orlando, United States
Duration: Nov 3 2014 - Nov 7 2014

Publication series

Name: MM 2014 - Proceedings of the 2014 ACM Conference on Multimedia

Other

Other: 2014 ACM Conference on Multimedia, MM 2014
Country/Territory: United States
City: Orlando
Period: 11/3/14 - 11/7/14

Keywords

  • Feature comparison
  • Topic models
  • Continuous affect recognition

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Media Technology
  • Software
