Null space versus orthogonal Linear Discriminant Analysis

Jieping Ye, Tao Xiong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

20 Citations (Scopus)

Abstract

Dimensionality reduction is an important pre-processing step for many applications. Linear Discriminant Analysis (LDA) is one of the well-known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the scatter matrices involved to be nonsingular. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA) and orthogonal LDA (OLDA), have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data, and the results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.
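The NLDA construction the abstract refers to — project the data onto the null space of the within-class scatter matrix, then maximize the between-class scatter inside that null space — can be sketched as follows. This is a minimal NumPy illustration under standard textbook definitions of the scatter matrices, not the authors' implementation; the function names and the eigenvalue tolerance used to detect the null space are assumptions.

```python
import numpy as np

def scatter_matrices(X, y):
    """Within-class (Sw) and between-class (Sb) scatter for X (n, d), labels y."""
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    return Sw, Sb

def nlda(X, y, k):
    """Null-space LDA sketch: top-k discriminant directions inside null(Sw)."""
    Sw, Sb = scatter_matrices(X, y)
    # Null space of Sw = eigenvectors with (near-)zero eigenvalues; for
    # undersampled data (d >> n) this space is large.
    w, V = np.linalg.eigh(Sw)
    N = V[:, w < 1e-10 * w.max()]
    # Maximize between-class scatter restricted to null(Sw).
    wb, Vb = np.linalg.eigh(N.T @ Sb @ N)
    G = N @ Vb[:, ::-1][:, :k]  # eigh is ascending; reverse for top-k
    return G
```

The returned transformation has orthonormal columns (both factors do). The paper's main result says that under a mild condition on the scatter matrices — one that typically holds for high-dimensional data — this NLDA transformation and the OLDA transformation yield equivalent reduced representations.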

Original language: English (US)
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 1073-1080
Number of pages: 8
Volume: 2006
State: Published - 2006
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: Jun 25, 2006 - Jun 29, 2006

Other

Other: ICML 2006: 23rd International Conference on Machine Learning
Country: United States
City: Pittsburgh, PA
Period: 6/25/06 - 6/29/06

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Ye, J., & Xiong, T. (2006). Null space versus orthogonal Linear Discriminant Analysis. In ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning (Vol. 2006, pp. 1073-1080).

@inproceedings{9fa8fac1db66429d82f5fbeed2bda082,
title = "Null space versus orthogonal Linear Discriminant Analysis",
abstract = "Dimensionality reduction is an important pre-processing step for many applications. Linear Discriminant Analysis (LDA) is one of the well known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the nonsingularity of scatter matrices involved. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA), orthogonal LDA (OLDA), etc, have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data and results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.",
author = "Jieping Ye and Tao Xiong",
year = "2006",
language = "English (US)",
isbn = "1595933832",
volume = "2006",
pages = "1073--1080",
booktitle = "ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning",

}

TY - GEN

T1 - Null space versus orthogonal Linear Discriminant Analysis

AU - Ye, Jieping

AU - Xiong, Tao

PY - 2006

Y1 - 2006

N2 - Dimensionality reduction is an important pre-processing step for many applications. Linear Discriminant Analysis (LDA) is one of the well known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the nonsingularity of scatter matrices involved. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA), orthogonal LDA (OLDA), etc, have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data and results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.

AB - Dimensionality reduction is an important pre-processing step for many applications. Linear Discriminant Analysis (LDA) is one of the well known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the nonsingularity of scatter matrices involved. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA), orthogonal LDA (OLDA), etc, have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data and results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.

UR - http://www.scopus.com/inward/record.url?scp=33749244971&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33749244971&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:33749244971

SN - 1595933832

SN - 9781595933836

VL - 2006

SP - 1073

EP - 1080

BT - ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning

ER -