An evaluation of identity representability of facial expressions using feature distributions

Qi Li, Chandra Kambhamettu, Jieping Ye

Research output: Contribution to journal › Article › peer-review


Abstract

How to represent appearance instances has been the focus of most previous work on face recognition. Little attention, however, has been given to the problem of how to select "good" instances for a gallery, which may be called the facial identity representation problem. This paper presents an evaluation of the identity representability of facial expressions. The identity representability of an expression is measured by the recognition accuracy achieved when its samples are used as the gallery data. We use feature distributions to represent appearance instances. A feature distribution of an image is based on the number of occurrences of detected interest points in the cells of a regular grid over the image plane. We present a new imbalance-oriented candidate selection algorithm for interest point detection. Our experimental evaluation indicates that certain facial expressions, such as the neutral expression, have stronger identity representability than others across various feature distributions. An application of the evaluation results to improving linear discriminant analysis is also presented to show the value of our evaluation work.
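
The abstract describes two computable quantities: a feature distribution obtained by counting detected interest points in the cells of a regular grid over the image plane, and an identity representability score defined as the recognition accuracy obtained when one expression's images form the gallery. The Python sketch below illustrates one possible reading of both; the 8x8 grid, the nearest-neighbour matcher, and all function names are illustrative assumptions rather than the paper's actual implementation.

    import numpy as np

    def feature_distribution(points, image_shape, grid=(8, 8)):
        """Grid histogram of interest-point occurrences over the image plane.

        points      : iterable of (row, col) interest-point coordinates
        image_shape : (height, width) of the image
        grid        : number of cells along each axis (illustrative choice)
        """
        h, w = image_shape
        hist = np.zeros(grid, dtype=float)
        for r, c in points:
            # Map each interest point to the grid cell that contains it.
            i = min(int(r * grid[0] / h), grid[0] - 1)
            j = min(int(c * grid[1] / w), grid[1] - 1)
            hist[i, j] += 1
        return hist.ravel()

    def representability(gallery_feats, gallery_ids, probe_feats, probe_ids):
        """Recognition accuracy when one expression's samples serve as the
        gallery: each probe is matched to its nearest gallery feature."""
        gallery_feats = np.asarray(gallery_feats, dtype=float)
        correct = 0
        for feat, true_id in zip(probe_feats, probe_ids):
            dists = np.linalg.norm(gallery_feats - np.asarray(feat, dtype=float), axis=1)
            if gallery_ids[int(np.argmin(dists))] == true_id:
                correct += 1
        return correct / len(probe_ids)

Running representability once per expression, with that expression's images as the gallery and the remaining images as probes, would yield the kind of per-expression comparison the abstract refers to.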

Original language: English (US)
Pages (from-to): 1902-1912
Number of pages: 11
Journal: Neurocomputing
Volume: 71
Issue number: 10-12
DOIs
State: Published - Jun 2008

Keywords

  • Face recognition
  • Facial expression
  • Interest points
  • Linear discriminant analysis

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
