An efficient algorithm for a class of fused Lasso problems

Jun Liu, Lei Yuan, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

106 Citations (Scopus)

Abstract

The fused Lasso penalty enforces sparsity in both the coefficients and their successive differences, which is desirable for applications with features ordered in some meaningful way. The resulting problem is, however, challenging to solve, as the fused Lasso penalty is both non-smooth and non-separable. Existing algorithms have high computational complexity and do not scale to large-scale problems. In this paper, we propose an Efficient Fused Lasso Algorithm (EFLA) for optimizing this class of problems. One key building block in the proposed EFLA is the Fused Lasso Signal Approximator (FLSA). To efficiently solve FLSA, we propose to reformulate it as the problem of finding an "appropriate" subgradient of the fused penalty at the minimizer, and develop a Subgradient Finding Algorithm (SFA). We further design a restart technique to accelerate the convergence of SFA, by exploiting the special "structures" of both the original and the reformulated FLSA problems. Our empirical evaluations show that both SFA and EFLA significantly outperform existing solvers. We also demonstrate several applications of the fused Lasso.
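The objective the abstract refers to can be sketched as follows. This is a minimal illustration of the fused Lasso objective (least-squares loss plus an ℓ1 penalty on the coefficients and on their successive differences), not the paper's EFLA/SFA solver; the function name and the toy piecewise-constant signal are illustrative choices.

```python
import numpy as np

def fused_lasso_objective(X, y, beta, lam1, lam2):
    """Fused Lasso objective: 0.5*||X beta - y||^2
    + lam1 * sum_i |beta_i|                (sparsity in coefficients)
    + lam2 * sum_i |beta_{i+1} - beta_i|   (sparsity in successive differences).
    Illustrative evaluation only -- not the paper's EFLA/SFA algorithm."""
    loss = 0.5 * np.sum((X @ beta - y) ** 2)
    sparsity = lam1 * np.sum(np.abs(beta))
    fusion = lam2 * np.sum(np.abs(np.diff(beta)))
    return loss + sparsity + fusion

# FLSA is the special case X = I: denoising a signal directly.
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(20), 2 * np.ones(20), np.zeros(20)])
y += 0.1 * rng.standard_normal(60)
X = np.eye(60)
beta = y.copy()  # trivial feasible point; a solver would minimize over beta
print(fused_lasso_objective(X, y, beta, lam1=0.1, lam2=1.0))
```

The fusion term is what makes the penalty non-separable: each coefficient is coupled to its neighbors, which is why standard coordinate-wise soft-thresholding does not apply directly.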

Original language: English (US)
Title of host publication: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Pages: 323-332
Number of pages: 10
ISBN: 9781450300551
DOI: 10.1145/1835804.1835847
State: Published - 2010
Event: 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD-2010 - Washington, DC, United States
Duration: Jul 25, 2010 - Jul 28, 2010



Keywords

  • ℓ₁ regularization
  • Fused Lasso
  • Restart
  • Subgradient

ASJC Scopus subject areas

  • Software
  • Information Systems

Cite this

Liu, J., Yuan, L., & Ye, J. (2010). An efficient algorithm for a class of fused Lasso problems. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 323-332). https://doi.org/10.1145/1835804.1835847

