Multi-class discriminant kernel learning via convex programming

Jieping Ye, Shuiwang Ji, Jianhui Chen

Research output: Contribution to journal › Article

86 Citations (Scopus)

Abstract

Regularized kernel discriminant analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick, and its performance depends critically on the selection of kernels. In this paper, we consider the problem of multiple kernel learning (MKL) for RKDA, in which the optimal kernel matrix is obtained as a linear combination of pre-specified kernel matrices. We show that the kernel learning problem in RKDA can be formulated as convex programs. First, we show that this problem can be cast as a semidefinite program (SDP). Based on the equivalence between RKDA and least-squares problems in the binary-class case, we then propose a convex quadratically constrained quadratic programming (QCQP) formulation for kernel learning in RKDA, and we derive a semi-infinite linear programming (SILP) formulation to further improve efficiency. We extend these formulations to the multi-class case based on a key result established in this paper: the multi-class RKDA kernel learning problem can be decomposed into a set of binary-class kernel learning problems that are constrained to share a common kernel. Based on this decomposition property, SDP formulations are proposed for the multi-class case, and the decomposition also leads naturally to multi-class QCQP and SILP formulations. Since the performance of RKDA depends on the regularization parameter, we show that this parameter can be optimized jointly with the kernel. Extensive experiments are conducted and analyzed, and connections to other algorithms are discussed.
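For intuition, the learned kernel in the MKL setting above is K(θ) = Σ_j θ_j K_j with θ_j ≥ 0, and the binary-class least-squares equivalence makes evaluating a candidate θ as cheap as one regularized linear solve. The numpy sketch below is a minimal illustration of that pipeline, not the paper's method: the helper names are ours, the kernel-ridge form (K + λI)α = ỹ is the standard realization of the least-squares equivalence we assume here, and the grid search over θ merely stands in for the SDP/QCQP/SILP solvers the paper actually derives.

import numpy as np

def rbf_kernel(X, Z, gamma):
    # Gram matrix of the Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def combine_kernels(kernels, theta):
    # Linear combination K(theta) = sum_j theta_j * K_j of pre-specified Gram matrices.
    return sum(t * K for t, K in zip(theta, kernels))

def rkda_binary_fit(K, y, lam):
    # Binary-class RKDA through its least-squares (kernel ridge) equivalence:
    # solve (K + lam * I) alpha = y_centered for the discriminant coefficients.
    n = K.shape[0]
    y_centered = y - y.mean()  # centered +/-1 class indicators
    return np.linalg.solve(K + lam * np.eye(n), y_centered)

# Toy data: two balanced Gaussian blobs, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (30, 2)), rng.normal(1.0, 1.0, (30, 2))])
y = np.array([-1.0] * 30 + [1.0] * 30)

# Two pre-specified base kernels with different bandwidths.
kernels = [rbf_kernel(X, X, g) for g in (0.1, 2.0)]

# Stand-in for the convex solvers: scan theta over the 2-simplex and score
# each candidate combined kernel by the resulting discriminant's fit.
best = (-np.inf, 0.0)
for t in np.linspace(0.0, 1.0, 11):
    K = combine_kernels(kernels, (t, 1.0 - t))
    alpha = rkda_binary_fit(K, y, lam=1e-2)
    acc = np.mean(np.sign(K @ alpha) == y)  # in-sample check only
    if acc > best[0]:
        best = (acc, t)

print(f"theta = ({best[1]:.1f}, {1 - best[1]:.1f}), training accuracy = {best[0]:.2f}")

In the paper itself, θ is obtained by solving the convex programs directly (and, in the multi-class case, by exploiting the shared-kernel decomposition), rather than by this kind of enumeration.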

Original language: English (US)
Pages (from-to): 719-758
Number of pages: 40
Journal: Journal of Machine Learning Research
Volume: 9
ISSN: 1532-4435
State: Published - Apr 2008

Keywords

  • Kernel discriminant analysis
  • Model selection
  • Quadratically constrained quadratic programming
  • Semi-infinite linear programming
  • Semidefinite programming

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Ye, Jieping; Ji, Shuiwang; Chen, Jianhui. Multi-class discriminant kernel learning via convex programming. In: Journal of Machine Learning Research, Vol. 9, April 2008, pp. 719-758.
