TY - GEN
T1 - Learning the kernel matrix in discriminant analysis via quadratically constrained quadratic programming
AU - Ye, Jieping
AU - Ji, Shuiwang
AU - Chen, Jianhui
PY - 2007
Y1 - 2007
N2 - The kernel function plays a central role in kernel methods. In this paper, we consider the automated learning of the kernel matrix over a convex combination of pre-specified kernel matrices in Regularized Kernel Discriminant Analysis (RKDA), which performs linear discriminant analysis in the feature space via the kernel trick. Previous studies have shown that this kernel learning problem can be formulated as a semidefinite program (SDP), which is however computationally expensive, even with the recent advances in interior point methods. Based on the equivalence relationship between RKDA and least squares problems in the binary-class case, we propose a Quadratically Constrained Quadratic Programming (QCQP) formulation for the kernel learning problem, which can be solved more efficiently than SDP. While most existing work on kernel learning deals with binary-class problems only, we show that our QCQP formulation can be extended naturally to the multi-class case. Experimental results on both binary-class and multi-class benchmark data sets show the efficacy of the proposed QCQP formulations.
AB - The kernel function plays a central role in kernel methods. In this paper, we consider the automated learning of the kernel matrix over a convex combination of pre-specified kernel matrices in Regularized Kernel Discriminant Analysis (RKDA), which performs linear discriminant analysis in the feature space via the kernel trick. Previous studies have shown that this kernel learning problem can be formulated as a semidefinite program (SDP), which is however computationally expensive, even with the recent advances in interior point methods. Based on the equivalence relationship between RKDA and least squares problems in the binary-class case, we propose a Quadratically Constrained Quadratic Programming (QCQP) formulation for the kernel learning problem, which can be solved more efficiently than SDP. While most existing work on kernel learning deals with binary-class problems only, we show that our QCQP formulation can be extended naturally to the multi-class case. Experimental results on both binary-class and multi-class benchmark data sets show the efficacy of the proposed QCQP formulations.
KW - Convex optimization
KW - Kernel discriminant analysis
KW - Kernel learning
KW - Model selection
KW - Quadratically constrained quadratic programming
UR - http://www.scopus.com/inward/record.url?scp=36849008168&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=36849008168&partnerID=8YFLogxK
U2 - 10.1145/1281192.1281283
DO - 10.1145/1281192.1281283
M3 - Conference contribution
AN - SCOPUS:36849008168
SN - 1595936092
SN - 9781595936097
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 854
EP - 863
BT - KDD-2007
T2 - KDD-2007: 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Y2 - 12 August 2007 through 15 August 2007
ER -