Least squares linear discriminant analysis

Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

232 Scopus citations

Abstract

Linear Discriminant Analysis (LDA) is a well-known method for dimensionality reduction and classification. LDA in the binary-class case has been shown to be equivalent to linear regression with the class label as the output. This implies that LDA for binary-class classification can be formulated as a least squares problem. Previous studies have shown a close relationship between multivariate linear regression and LDA in the multi-class case. Many of these studies show that multivariate linear regression with a specific class indicator matrix as the output can be applied as a preprocessing step for LDA. However, directly casting LDA as a least squares problem is challenging for the multi-class case. In this paper, a novel formulation for multivariate linear regression is proposed. The equivalence relationship between the proposed least squares formulation and LDA for multi-class classification is rigorously established under a mild condition, which is shown empirically to hold in many applications involving high-dimensional data. Several LDA extensions based on the equivalence relationship are discussed.
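
As an illustration of the binary-class equivalence stated in the abstract (the LDA projection direction coincides, up to scaling, with the weight vector of a least-squares fit that uses the class label as the target), the following sketch checks the claim numerically on synthetic data. It is not the paper's multi-class indicator-matrix formulation; the Gaussian data, class sizes, and the particular centered label encoding are assumptions chosen only for the demonstration.

```python
# Minimal sketch (illustrative assumptions, not the paper's formulation):
# verify that the binary-class LDA direction is parallel to the weights of
# a least-squares regression on a centered class-label target.
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian classes in d dimensions (illustrative data and class sizes).
n1, n2, d = 40, 60, 5
X = np.vstack([rng.normal(0.0, 1.0, (n1, d)),
               rng.normal(1.0, 1.0, (n2, d))])
y = np.array([0] * n1 + [1] * n2)

# Center the data.
Xc = X - X.mean(axis=0)

# LDA direction: w_lda = Sw^{-1} (mu2 - mu1), with Sw the within-class scatter.
mu1, mu2 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = ((X[y == 0] - mu1).T @ (X[y == 0] - mu1)
      + (X[y == 1] - mu2).T @ (X[y == 1] - mu2))
w_lda = np.linalg.solve(Sw, mu2 - mu1)

# Least-squares direction: regress a zero-mean class-label encoding on Xc.
# (Any two distinct target values per class yield the same direction.)
t = np.where(y == 1, n1 / (n1 + n2), -n2 / (n1 + n2))
w_ls, *_ = np.linalg.lstsq(Xc, t, rcond=None)

# The two directions should be parallel (cosine similarity close to 1).
cos = w_lda @ w_ls / (np.linalg.norm(w_lda) * np.linalg.norm(w_ls))
print(f"cosine similarity between LDA and least-squares directions: {cos:.6f}")
```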

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Pages: 1087-1093
Number of pages: 7
Volume: 227
DOIs
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: Jun 20 2007 - Jun 24 2007

Other

Other: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvallis, OR
Period: 6/20/07 - 6/24/07

ASJC Scopus subject areas

  • Human-Computer Interaction
