Learning the optimal neighborhood kernel for classification

Jun Liu, Jianhui Chen, Songcan Chen, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.
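The abstract's core idea, perturbing a pre-specified kernel matrix by a rank-one term and optimizing the perturbation under a constraint via projected gradient steps, can be sketched as follows. This is an illustrative stand-in, not the paper's method: it uses kernel–target alignment with the label kernel yyᵀ as the objective instead of the authors' constrained quartic formulation, and the function name, step size, and norm constraint are assumptions.

```python
import numpy as np

def rank_one_kernel_update(K, y, step=0.01, n_iters=200, radius=1.0):
    """Sketch: find v with ||v|| <= radius so that the neighborhood kernel
    K + v v^T aligns better with the label kernel y y^T.

    Illustrative only -- the paper instead solves a constrained quartic
    problem via the level method or constrained gradient descent.
    """
    n = K.shape[0]
    Y = np.outer(y, y)                      # "ideal" label kernel y y^T
    rng = np.random.default_rng(0)
    v = rng.standard_normal(n)
    v *= radius / np.linalg.norm(v)         # start on the norm ball
    for _ in range(n_iters):
        # gradient of <v v^T, Y>_F = (v . y)^2 with respect to v is 2 Y v
        v = v + step * (2.0 * Y @ v)
        norm = np.linalg.norm(v)
        if norm > radius:                   # project back onto the ball
            v *= radius / norm
    return K + np.outer(v, v)               # rank-one update of K
```

Here the rank-one structure of the learned perturbation mirrors the result quoted in the abstract (optimal neighborhood kernel = pre-specified kernel + rank-one matrix), while the projection step is the simplest way to keep a constrained gradient iteration feasible.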

Original language: English (US)
Title of host publication: IJCAI-09 - Proceedings of the 21st International Joint Conference on Artificial Intelligence
Publisher: International Joint Conferences on Artificial Intelligence
Number of pages: 6
ISBN (Print): 9781577354260
State: Published - 2009
Event: 21st International Joint Conference on Artificial Intelligence, IJCAI 2009 - Pasadena, United States
Duration: Jul 11, 2009 – Jul 16, 2009

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823


Conference: 21st International Joint Conference on Artificial Intelligence, IJCAI 2009
Country/Territory: United States

ASJC Scopus subject areas

  • Artificial Intelligence
