A least squares formulation for a class of generalized eigenvalue problems in machine learning

Liang Sun, Shuiwang Ji, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Many machine learning algorithms can be formulated as a generalized eigenvalue problem. One major limitation of such a formulation is that the generalized eigenvalue problem is computationally expensive to solve, especially for large-scale problems. In this paper, we show that under a mild condition, a class of generalized eigenvalue problems in machine learning can be formulated as a least squares problem. This class of problems includes classical techniques such as Canonical Correlation Analysis (CCA), Partial Least Squares (PLS), and Linear Discriminant Analysis (LDA), as well as Hypergraph Spectral Learning (HSL). As a result, various regularization techniques can be readily incorporated into the formulation to improve model sparsity and generalization ability. In addition, the least squares formulation leads to efficient and scalable implementations based on iterative conjugate gradient-type algorithms. We report experimental results that confirm the established equivalence relationship. The results also demonstrate the efficiency and effectiveness of the equivalent least squares formulations on large-scale problems.
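
To make the computational contrast in the abstract concrete, the sketch below compares the two routes on a small synthetic LDA-style problem: solving a generalized eigenvalue problem directly versus solving a related regularized least squares problem with a conjugate gradient-type solver (LSQR). The class-indicator target matrix, the regularization constants, and the use of scipy.linalg.eigh and scipy.sparse.linalg.lsqr are illustrative assumptions made for this sketch only; the paper derives its own target matrix and the exact condition under which the two formulations coincide.

```python
# Minimal sketch (not the authors' exact construction): contrast a generalized
# eigenvalue problem S1 w = lambda * S2 w with a ridge-regularized least squares
# problem solved by a conjugate gradient-type method (LSQR). The class-indicator
# target T below is a common LDA-style choice and is an assumption of this sketch.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n, d, k = 500, 200, 3                      # samples, features, classes
X = rng.standard_normal((n, d))
X = X - X.mean(axis=0)                     # center the data
y = rng.integers(0, k, size=n)

# Centered class-indicator target matrix (one column per class).
T = np.zeros((n, k))
T[np.arange(n), y] = 1.0
T = T - T.mean(axis=0)

# Direct route: generalized eigenvalue problem on dense d x d scatter matrices.
S1 = X.T @ T @ T.T @ X                     # between-class-style scatter
S2 = X.T @ X + 1e-6 * np.eye(d)            # total scatter, lightly regularized
evals, evecs = eigh(S1, S2)                # generalized symmetric eigensolver
W_eig = evecs[:, -(k - 1):]                # top k-1 generalized eigenvectors

# Least squares route: one regression per target column, solved iteratively.
# LSQR touches X only through matrix-vector products, so it scales to large
# and sparse data without forming d x d matrices.
W_ls = np.column_stack(
    [lsqr(X, T[:, j], damp=1e-3)[0] for j in range(k)]
)
print(W_eig.shape, W_ls.shape)             # (d, k-1) and (d, k)
```

The practical point is that the least squares route never forms or factorizes the d x d scatter matrices; it only needs products with the data matrix, which is why conjugate gradient-type solvers make the formulation attractive for large-scale problems.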

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Volume: 382
DOIs: 10.1145/1553374.1553499
ISBN (Print): 9781605585161
State: Published - 2009
Event: 26th Annual International Conference on Machine Learning, ICML'09 - Montreal, QC, Canada
Duration: Jun 14, 2009 – Jun 18, 2009

Other

Other: 26th Annual International Conference on Machine Learning, ICML'09
Country: Canada
City: Montreal, QC
Period: 6/14/09 – 6/18/09

Fingerprint

Learning systems
Discriminant analysis
Learning algorithms

ASJC Scopus subject areas

  • Human-Computer Interaction

Cite this

Sun, L., Ji, S., & Ye, J. (2009). A least squares formulation for a class of generalized eigenvalue problems in machine learning. In ACM International Conference Proceeding Series (Vol. 382). [122] https://doi.org/10.1145/1553374.1553499
