Learning the optimal neighborhood kernel for classification

Jun Liu, Jianhui Chen, Songcan Chen, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.
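
The abstract above is enough to sketch the general shape of the method in code. The sketch below is illustrative only: it forms a neighborhood kernel as the pre-specified Gram matrix plus a rank-one term and runs a generic projected (constrained) gradient descent on a simple alignment-style objective. The RBF base kernel, the ball constraint, the surrogate objective, and all step sizes are assumptions made for demonstration, not the paper's actual quartic formulation or its level method.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Pre-specified (possibly noisy) kernel: a standard RBF Gram matrix."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def neighborhood_kernel(K, v, lam):
    """Neighborhood kernel of the form described in the abstract:
    the pre-specified kernel plus a rank-one term. Here lam and v are
    placeholders for the quantities the paper learns."""
    return K + lam * np.outer(v, v)

def projected_gradient_step(v, grad, lr, radius=1.0):
    """One constrained gradient-descent step: move against the gradient,
    then project v back onto a Euclidean ball (a stand-in for the paper's
    feasible set, which this record does not spell out)."""
    v = v - lr * grad
    norm = np.linalg.norm(v)
    if norm > radius:
        v = v * (radius / norm)
    return v

# Toy usage: two Gaussian blobs, maximizing a kernel-target alignment
# surrogate y^T K_opt y over the rank-one direction v.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 5)), rng.normal(+1, 1, (20, 5))])
y = np.concatenate([-np.ones(20), np.ones(20)])

K = rbf_kernel(X, gamma=0.5)
v, lam, lr = rng.normal(size=len(y)), 0.1, 0.01
for _ in range(100):
    # Gradient of -y^T (K + lam v v^T) y with respect to v.
    grad = -2 * lam * (y @ v) * y
    v = projected_gradient_step(v, grad, lr)

print("alignment before:", float(y @ K @ y))
print("alignment after: ", float(y @ neighborhood_kernel(K, v, lam) @ y))
```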

Original language: English (US)
Title of host publication: IJCAI International Joint Conference on Artificial Intelligence
Pages: 1144-1149
Number of pages: 6
State: Published - 2009
Event: 21st International Joint Conference on Artificial Intelligence, IJCAI-09 - Pasadena, CA, United States
Duration: Jul 11 2009 → Jul 17 2009

Other

Other: 21st International Joint Conference on Artificial Intelligence, IJCAI-09
Country: United States
City: Pasadena, CA
Period: 7/11/09 → 7/17/09

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Liu, J., Chen, J., Chen, S., & Ye, J. (2009). Learning the optimal neighborhood kernel for classification. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1144-1149).

Learning the optimal neighborhood kernel for classification. / Liu, Jun; Chen, Jianhui; Chen, Songcan; Ye, Jieping.

IJCAI International Joint Conference on Artificial Intelligence. 2009. p. 1144-1149.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Liu, J, Chen, J, Chen, S & Ye, J 2009, Learning the optimal neighborhood kernel for classification. in IJCAI International Joint Conference on Artificial Intelligence. pp. 1144-1149, 21st International Joint Conference on Artificial Intelligence, IJCAI-09, Pasadena, CA, United States, 7/11/09.
Liu J, Chen J, Chen S, Ye J. Learning the optimal neighborhood kernel for classification. In IJCAI International Joint Conference on Artificial Intelligence. 2009. p. 1144-1149
Liu, Jun ; Chen, Jianhui ; Chen, Songcan ; Ye, Jieping. / Learning the optimal neighborhood kernel for classification. IJCAI International Joint Conference on Artificial Intelligence. 2009. pp. 1144-1149
@inproceedings{a67b391afb0d48d1b8103680df2706ba,
title = "Learning the optimal neighborhood kernel for classification",
abstract = "Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the {"}ideal{"} kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the {"}ideal{"} kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.",
author = "Jun Liu and Jianhui Chen and Songcan Chen and Jieping Ye",
year = "2009",
language = "English (US)",
isbn = "9781577354260",
pages = "1144--1149",
booktitle = "IJCAI International Joint Conference on Artificial Intelligence",

}

TY - GEN

T1 - Learning the optimal neighborhood kernel for classification

AU - Liu, Jun

AU - Chen, Jianhui

AU - Chen, Songcan

AU - Ye, Jieping

PY - 2009

Y1 - 2009

N2 - Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.

AB - Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.

UR - http://www.scopus.com/inward/record.url?scp=77958559950&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77958559950&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781577354260

SP - 1144

EP - 1149

BT - IJCAI International Joint Conference on Artificial Intelligence

ER -