Discriminant kernel and regularization parameter learning via semidefinite programming

Jieping Ye, Jianhui Chen, Shuiwang Ji

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Scopus citations

Abstract

Regularized Kernel Discriminant Analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. The performance of RKDA depends on the selection of kernels. In this paper, we consider the problem of learning an optimal kernel over a convex set of kernels. We show that the kernel learning problem can be formulated as a semidefinite program (SDP) in the binary-class case. We further extend the SDP formulation to the multi-class case. The extension rests on a key result established in this paper: the multi-class kernel learning problem can be decomposed into a set of binary-class kernel learning problems. In addition, we propose an approximation scheme to reduce the computational complexity of the multi-class SDP formulation. The performance of RKDA also depends on the value of the regularization parameter, and we show that this value can be learned automatically within the same framework. Experimental results on benchmark data sets demonstrate the efficacy of the proposed SDP formulations.
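To make the setup concrete, the following is a minimal sketch of the two ingredients the abstract describes: a convex combination of base kernels (whose weights the paper's SDP would learn) and a regularized kernel discriminant fit. It is not the paper's method; the discriminant step here uses kernel ridge regression on ±1 labels, which is closely related to the kernel Fisher discriminant, and the kernel weights `theta` and regularization parameter `lam` are fixed by hand rather than optimized. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def combined_kernel(X, Y, gammas, theta):
    """Convex combination sum_i theta_i * K_i, with theta_i >= 0 summing to 1.

    The paper's SDP learns theta; here the weights are fixed.
    """
    return sum(t * rbf_kernel(X, Y, g) for t, g in zip(theta, gammas))

def fit_discriminant(K, y, lam):
    """Regularized discriminant coefficients via (K + lam*I) alpha = y.

    This kernel-ridge solution on +/-1 labels is a simplified stand-in
    for the RKDA solver; lam plays the role of the regularization
    parameter that the paper also learns automatically.
    """
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y.astype(float))

# Toy binary problem: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)), rng.normal(1.0, 0.5, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

gammas = [0.1, 1.0]   # two candidate base kernels (assumed bandwidths)
theta = [0.5, 0.5]    # fixed convex weights; the SDP would learn these
lam = 1e-2            # fixed regularization parameter

K = combined_kernel(X, X, gammas, theta)
alpha = fit_discriminant(K, y, lam)
acc = np.mean(np.sign(K @ alpha) == y)
print(f"training accuracy: {acc:.2f}")
```

On this easy toy data the fixed equal-weight combination already separates the classes; the point of the paper's SDP formulation is to choose `theta` (and the regularization value) optimally rather than by hand.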

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Pages: 1095-1102
Number of pages: 8
Volume: 227
DOI: 10.1145/1273496.1273634
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: Jun 20, 2007 - Jun 24, 2007



ASJC Scopus subject areas

  • Human-Computer Interaction

Cite this

Ye, J., Chen, J., & Ji, S. (2007). Discriminant kernel and regularization parameter learning via semidefinite programming. In ACM International Conference Proceeding Series (Vol. 227, pp. 1095-1102). https://doi.org/10.1145/1273496.1273634