Kernel uncorrelated and regularized discriminant analysis

A theoretical and computational study

Shuiwang Ji, Jieping Ye

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

Linear and kernel discriminant analyses are popular approaches for supervised dimensionality reduction. Uncorrelated and regularized discriminant analyses have been proposed to overcome the singularity problem encountered by classical discriminant analysis. In this paper, we study the properties of kernel uncorrelated and regularized discriminant analyses, called KUDA and KRDA, respectively. In particular, we show that under a mild condition, both linear and kernel uncorrelated discriminant analysis project samples in the same class to a common vector in the dimensionality-reduced space. This implies that uncorrelated discriminant analysis may suffer from the overfitting problem if there are a large number of samples in each class. We show that as the regularization parameter in KRDA tends to zero, KRDA approaches KUDA. This shows that KUDA is a special case of KRDA and that regularization can be applied to overcome the overfitting problem in uncorrelated discriminant analysis. As the performance of KRDA depends on the value of the regularization parameter, we show that the matrix computations involved in KRDA can be simplified, so that a large number of candidate values can be cross-validated efficiently. Finally, we conduct experiments to evaluate the proposed theories and algorithms.
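To make the abstract's two computational claims concrete, consider first the limit relating KRDA to KUDA. A minimal formalization in generic scatter-matrix notation (S_b for between-class scatter, S_t for total scatter; the paper's own symbols, and their kernel-space counterparts, may differ) is

\[
  G_\lambda \;=\; \operatorname*{arg\,max}_{G}\;
  \operatorname{trace}\!\left(
    \bigl(G^\top (S_t + \lambda I)\, G\bigr)^{-1} G^\top S_b\, G
  \right),
  \qquad \lambda > 0,
  \qquad
  \lim_{\lambda \to 0^{+}} G_\lambda \;=\; G_{\mathrm{UDA}},
\]

where G_UDA denotes the uncorrelated discriminant transformation, characterized by the decorrelation constraint G^T S_t G = I. Second, the claim that a large number of candidate regularization values can be cross-validated efficiently admits a short sketch. A plausible mechanism (an assumption here, not necessarily the paper's exact simplification) is that a single eigendecomposition S_t = U diag(sigma) U^T can be reused for every candidate, since (S_t + lambda I)^{-1} = U diag(1/(sigma_i + lambda)) U^T. A minimal Python/NumPy sketch under that assumption; the function and variable names are illustrative, not from the paper:

import numpy as np

def rda_transforms_for_all_lambdas(S_t, S_b, lambdas, k):
    """Sketch: RDA transformations for many regularization values.

    Factor the symmetric PSD total scatter S_t once; for each lambda,
    (S_t + lam*I)^{-1} = U @ diag(1/(sigma + lam)) @ U.T, so every
    candidate costs one small eigensolve instead of a fresh inversion.
    """
    sigma, U = np.linalg.eigh(S_t)           # one-time factorization of S_t
    S_b_rot = U.T @ S_b @ U                  # rotate S_b into the eigenbasis once
    transforms = {}
    for lam in lambdas:
        inv_diag = 1.0 / (sigma + lam)       # spectrum of (S_t + lam*I)^{-1}
        M = inv_diag[:, None] * S_b_rot      # (S_t + lam*I)^{-1} S_b in eigenbasis
        evals, V = np.linalg.eig(M)          # M is non-symmetric in general
        top = np.argsort(-evals.real)[:k]    # keep the k leading directions
        transforms[lam] = U @ V[:, top].real # map back to the original basis
    return transforms

In the kernel variant the same reuse applies with the centered kernel scatter matrices in place of S_t and S_b, so each candidate lambda can be scored by cross-validated accuracy in the reduced space at the cost of one small eigensolve rather than a fresh factorization.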

Original language: English (US)
Article number: 4479466
Pages (from-to): 1311-1321
Number of pages: 11
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 20
Issue number: 10
DOI: 10.1109/TKDE.2008.57
State: Published - Oct 2008

Keywords

  • Discriminant analysis
  • Kernel methods
  • Model selection
  • Regularization
  • Singular value decomposition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Information Systems

Cite this

Kernel uncorrelated and regularized discriminant analysis: A theoretical and computational study. / Ji, Shuiwang; Ye, Jieping.

In: IEEE Transactions on Knowledge and Data Engineering, Vol. 20, No. 10, Article 4479466, Oct. 2008, pp. 1311-1321. DOI: 10.1109/TKDE.2008.57.

@article{58edc70d73474940a338bf196b26a0dc,
  title     = "Kernel uncorrelated and regularized discriminant analysis: A theoretical and computational study",
  author    = "Shuiwang Ji and Jieping Ye",
  journal   = "IEEE Transactions on Knowledge and Data Engineering",
  issn      = "1041-4347",
  publisher = "IEEE Computer Society",
  volume    = "20",
  number    = "10",
  pages     = "1311--1321",
  year      = "2008",
  month     = oct,
  doi       = "10.1109/TKDE.2008.57",
  language  = "English (US)",
}
