Two-dimensional linear discriminant analysis

Jieping Ye, Ravi Janardan, Qi Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

141 Citations (Scopus)

Abstract

Linear Discriminant Analysis (LDA) is a well-known scheme for feature extraction and dimension reduction. It has been used widely in many applications involving high-dimensional data, such as face recognition and image retrieval. An intrinsic limitation of classical LDA is the so-called singularity problem, that is, it fails when all scatter matrices are singular. A well-known approach to deal with the singularity problem is to apply an intermediate dimension reduction stage using Principal Component Analysis (PCA) before LDA. The algorithm, called PCA+LDA, is used widely in face recognition. However, PCA+LDA has high costs in time and space, due to the need for an eigen-decomposition involving the scatter matrices. In this paper, we propose a novel LDA algorithm, namely 2DLDA, which stands for 2-Dimensional Linear Discriminant Analysis. 2DLDA overcomes the singularity problem implicitly, while achieving efficiency. The key difference between 2DLDA and classical LDA lies in the model for data representation. Classical LDA works with vectorized representations of data, while the 2DLDA algorithm works with data in matrix representation. To further reduce the dimension by 2DLDA, the combination of 2DLDA and classical LDA, namely 2DLDA+LDA, is studied, where LDA is preceded by 2DLDA. The proposed algorithms are applied on face recognition and compared with PCA+LDA. Experiments show that 2DLDA and 2DLDA+LDA achieve competitive recognition accuracy, while being much more efficient.
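The abstract describes 2DLDA only at a high level: each sample is kept as a matrix (e.g., an r x c face image) and is reduced by a bilinear projection L^T X R instead of being vectorized first. The NumPy sketch below is an illustration of that idea under stated assumptions, not the paper's exact procedure; the alternating update (fix R, solve an eigenproblem for L, then fix L and solve for R), the pseudo-inverse guard against singular within-class scatter, the component counts l1 and l2, and the function name two_dlda are all illustrative choices.

import numpy as np

# Minimal 2DLDA-style sketch (an illustration, not the paper's exact algorithm).
# X: array of shape (n_samples, r, c) holding matrix-form samples (e.g., images).
# y: integer class labels. Returns L (r x l1) and R (c x l2) so that each sample
# is reduced to the l1 x l2 feature matrix L^T X R.
def two_dlda(X, y, l1=5, l2=5, n_iter=5):
    n, r, c = X.shape
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    class_means = {k: X[y == k].mean(axis=0) for k in classes}
    counts = {k: int(np.sum(y == k)) for k in classes}
    R = np.eye(c, l2)  # start from the first l2 coordinate directions
    for _ in range(n_iter):
        # Fix R, update L: within/between-class scatter of the row-projected data X R.
        Sw = sum((X[i] - class_means[y[i]]) @ R @ R.T @ (X[i] - class_means[y[i]]).T
                 for i in range(n))
        Sb = sum(counts[k] * (class_means[k] - overall_mean) @ R @ R.T
                 @ (class_means[k] - overall_mean).T for k in classes)
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)  # pinv guards against singular Sw
        L = np.real(evecs[:, np.argsort(-np.real(evals))[:l1]])
        # Fix L, update R: within/between-class scatter of the column-projected data L^T X.
        Sw = sum((X[i] - class_means[y[i]]).T @ L @ L.T @ (X[i] - class_means[y[i]])
                 for i in range(n))
        Sb = sum(counts[k] * (class_means[k] - overall_mean).T @ L @ L.T
                 @ (class_means[k] - overall_mean) for k in classes)
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        R = np.real(evecs[:, np.argsort(-np.real(evals))[:l2]])
    return L, R

# Toy usage: twenty random 16 x 16 "images" from two shifted classes.
rng = np.random.default_rng(0)
y = np.repeat(np.arange(2), 10)
X = rng.normal(size=(20, 16, 16)) + y[:, None, None]
L, R = two_dlda(X, y)
features = np.einsum('rk,nrc,cl->nkl', L, X, R)  # L^T X R for every sample
print(features.shape)  # (20, 5, 5)

Note that the scatter matrices in this sketch are only r x r and c x c, whereas classical LDA on vectorized images would form rc x rc scatter matrices; that size difference is the source of the efficiency, and of the implicit handling of singularity, claimed in the abstract.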

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262195348, 9780262195348
State: Published - 2005
Externally published: Yes
Event: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004 - Vancouver, BC, Canada
Duration: Dec 13, 2004 - Dec 16, 2004

Other

Other: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004
Country: Canada
City: Vancouver, BC
Period: 12/13/04 - 12/16/04

Fingerprint

Discriminant analysis
Principal component analysis
Face recognition
Image retrieval
Feature extraction

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Ye, J., Janardan, R., & Li, Q. (2005). Two-dimensional linear discriminant analysis. In Advances in Neural Information Processing Systems. Neural Information Processing Systems Foundation.
