Discriminant kernel and regularization parameter learning via semidefinite programming

Jieping Ye, Jianhui Chen, Shuiwang Ji

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

19 Scopus citations

Abstract

Regularized Kernel Discriminant Analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. The performance of RKDA depends on the selection of the kernel. In this paper, we consider the problem of learning an optimal kernel over a convex set of kernels. We show that the kernel learning problem can be formulated as a semidefinite program (SDP) in the binary-class case. We then extend the SDP formulation to the multi-class case; the extension rests on a key result established in this paper, namely that the multi-class kernel learning problem can be decomposed into a set of binary-class kernel learning problems. In addition, we propose an approximation scheme that reduces the computational complexity of the multi-class SDP formulation. The performance of RKDA also depends on the value of the regularization parameter; we show that this value can be learned automatically within the same framework. Experimental results on benchmark data sets demonstrate the efficacy of the proposed SDP formulations.
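The SDP formulations themselves are developed in the paper and are not reproduced here. As a rough illustration of the setup the abstract describes (RKDA over a convex combination of base kernels, with the regularization parameter selected jointly), the following Python sketch substitutes a simple random search over convex kernel weights and a held-out split for the paper's SDP. The Gaussian kernel widths, the toy data, and the search grids are all illustrative assumptions; the RKDA direction is obtained through the standard ridge-style solve alpha = (K + λI)⁻¹ y, which recovers the kernel Fisher direction up to scaling for ±1-coded labels.

```python
# Minimal sketch (NOT the paper's SDP): learn a convex combination of base
# kernels K(theta) = sum_i theta_i K_i for RKDA by random search over the
# weights theta and a grid over the regularization parameter lambda.
import numpy as np

def rbf_kernel(X, Z, gamma):
    # Gaussian kernel: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

rng = np.random.default_rng(0)
n = 60  # points per class (toy two-class data; illustrative assumption)
X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)), rng.normal(1.0, 1.0, (n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])
idx = rng.permutation(2 * n)
tr, va = idx[:80], idx[80:]                       # train / validation split

gammas = [0.1, 1.0, 10.0]                         # widths of the base kernels
K_tr = [rbf_kernel(X[tr], X[tr], g) for g in gammas]
K_va = [rbf_kernel(X[va], X[tr], g) for g in gammas]

best = None
for theta in rng.dirichlet(np.ones(len(gammas)), size=200):  # convex weights
    Kt = sum(t * K for t, K in zip(theta, K_tr))  # K(theta) on training block
    Kv = sum(t * K for t, K in zip(theta, K_va))
    for lam in [1e-3, 1e-2, 1e-1, 1.0]:           # regularization parameter
        # RKDA direction via its ridge equivalence on +/-1-coded labels:
        alpha = np.linalg.solve(Kt + lam * np.eye(len(tr)), y[tr])
        acc = np.mean(np.sign(Kv @ alpha) == y[va])
        if best is None or acc > best[0]:
            best = (acc, theta, lam)

print("validation accuracy %.3f  theta=%s  lambda=%g" % best)
```

The paper replaces this brute-force search with a single SDP that optimizes the kernel weights, and shows that the regularization parameter can be learned within the same framework rather than tuned on a grid.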

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Pages: 1095-1102
Number of pages: 8
Volume: 227
DOIs
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: Jun 20, 2007 – Jun 24, 2007


ASJC Scopus subject areas

  • Human-Computer Interaction
