A least squares formulation for a class of generalized eigenvalue problems in machine learning

Liang Sun, Shuiwang Ji, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution

26 Scopus citations

Abstract

Many machine learning algorithms can be formulated as a generalized eigenvalue problem. One major limitation of such a formulation is that the generalized eigenvalue problem is computationally expensive to solve, especially for large-scale problems. In this paper, we show that under a mild condition, a class of generalized eigenvalue problems in machine learning can be formulated as a least squares problem. This class of problems includes classical techniques such as Canonical Correlation Analysis (CCA), Partial Least Squares (PLS), and Linear Discriminant Analysis (LDA), as well as Hypergraph Spectral Learning (HSL). As a result, various regularization techniques can be readily incorporated into the formulation to improve model sparsity and generalization ability. In addition, the least squares formulation leads to efficient and scalable implementations based on iterative conjugate-gradient-type algorithms. We report experimental results that confirm the established equivalence relationship. Results also demonstrate the efficiency and effectiveness of the equivalent least squares formulations on large-scale problems.
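The scalability claim in the abstract rests on the fact that a (regularized) least squares problem can be solved iteratively without forming or factorizing large matrices. The sketch below (not code from the paper; all names are my own) illustrates this with SciPy's LSQR, a conjugate-gradient-type solver: a ridge-regularized least squares problem solved iteratively matches the direct solution of the normal equations.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n, d = 200, 50                         # samples, features
X = rng.standard_normal((n, d))
# A centered binary class-indicator target, in the spirit of
# least-squares formulations of LDA (illustrative choice only)
y = np.where(rng.random(n) < 0.5, 1.0, -1.0)
y = y - y.mean()

lam = 1e-2                             # ridge (Tikhonov) regularization strength

# Direct solve of the regularized normal equations:
# (X^T X + lam * I) w = X^T y  -- O(d^3), infeasible for very large d
w_direct = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Iterative conjugate-gradient-type solve: LSQR with damping sqrt(lam)
# minimizes ||X w - y||^2 + lam ||w||^2 using only matrix-vector products
w_iter = lsqr(X, y, damp=np.sqrt(lam), atol=1e-12, btol=1e-12)[0]

print(np.allclose(w_direct, w_iter, atol=1e-6))   # solutions agree
```

Because LSQR accesses `X` only through matrix-vector products, it scales to sparse or very large data where the direct `d x d` solve would be prohibitive, which is the practical payoff of recasting the eigenproblem as least squares.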

Original language: English (US)
Title of host publication: Proceedings of the 26th International Conference on Machine Learning, ICML 2009
Pages: 977-984
Number of pages: 8
State: Published - Dec 9 2009
Event: 26th International Conference on Machine Learning, ICML 2009 - Montreal, QC, Canada
Duration: Jun 14 2009 - Jun 18 2009

Publication series

Name: Proceedings of the 26th International Conference on Machine Learning, ICML 2009

Other

Other: 26th International Conference on Machine Learning, ICML 2009
Country: Canada
City: Montreal, QC
Period: 6/14/09 - 6/18/09

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software


Cite this

    Sun, L., Ji, S., & Ye, J. (2009). A least squares formulation for a class of generalized eigenvalue problems in machine learning. In Proceedings of the 26th International Conference On Machine Learning, ICML 2009 (pp. 977-984). (Proceedings of the 26th International Conference On Machine Learning, ICML 2009).