Integrating low-rank and group-sparse structures for robust multi-task learning

Jianhui Chen, Jiayu Zhou, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

160 Citations (Scopus)

Abstract

Multi-task learning (MTL) aims at improving the generalization performance by utilizing the intrinsic relationships among multiple related tasks. A key assumption in most MTL algorithms is that all tasks are related, which, however, may not be the case in many real-world applications. In this paper, we propose a robust multi-task learning (RMTL) algorithm which learns multiple tasks simultaneously as well as identifies the irrelevant (outlier) tasks. Specifically, the proposed RMTL algorithm captures the task relationships using a low-rank structure, and simultaneously identifies the outlier tasks using a group-sparse structure. The proposed RMTL algorithm is formulated as a non-smooth convex (unconstrained) optimization problem. We propose to adopt the accelerated proximal method (APM) for solving such an optimization problem. The key component in APM is the computation of the proximal operator, which can be shown to admit an analytic solution. We also theoretically analyze the effectiveness of the RMTL algorithm. In particular, we derive a key property of the optimal solution to RMTL; moreover, based on this key property, we establish a theoretical bound for characterizing the learning performance of RMTL. Our experimental results on benchmark data sets demonstrate the effectiveness and efficiency of the proposed algorithm.
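The abstract states that the proximal operator in APM admits an analytic solution. For the two structures described (a low-rank component capturing task relationships and a column-wise group-sparse component flagging outlier tasks), the standard closed forms are singular-value soft-thresholding for a trace-norm penalty and column-wise group soft-thresholding for an ℓ2,1 penalty. The sketch below illustrates those two standard operators; it is not the paper's implementation, and the function names, thresholds, and the choice of these particular regularizers are assumptions for illustration:

```python
import numpy as np

def prox_trace_norm(L, tau):
    """Singular-value soft-thresholding: proximal operator of tau * ||L||_*.

    Shrinks each singular value by tau (clipping at zero), which
    encourages a low-rank solution."""
    U, s, Vt = np.linalg.svd(L, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt  # reassemble with shrunken spectrum

def prox_group_sparse(S, tau):
    """Column-wise group soft-thresholding: prox of tau * sum_j ||S[:, j]||_2.

    Columns (tasks) whose Euclidean norm falls below tau are zeroed out
    entirely -- zeroed columns correspond to non-outlier tasks, while
    surviving columns flag outliers."""
    norms = np.linalg.norm(S, axis=0, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return S * scale
```

In an accelerated proximal scheme, each iteration would take a gradient step on the smooth loss and then apply these operators to the two components separately, which is what makes the per-iteration cost low.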

Original language: English (US)
Title of host publication: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Pages: 42-50
Number of pages: 9
DOI: 10.1145/2020408.2020423
ISBN (Print): 9781450308137
State: Published - 2011
Event: 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD'11 - San Diego, CA, United States
Duration: Aug 21, 2011 - Aug 24, 2011

Keywords

  • Group-sparsity
  • Low-rank patterns
  • Multi-task learning
  • Robust

ASJC Scopus subject areas

  • Software
  • Information Systems

Cite this

Chen, J., Zhou, J., & Ye, J. (2011). Integrating low-rank and group-sparse structures for robust multi-task learning. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 42-50). https://doi.org/10.1145/2020408.2020423

@inproceedings{7b95ad7a9af8413eab6e1c2b693f4b95,
  title     = "Integrating low-rank and group-sparse structures for robust multi-task learning",
  author    = "Jianhui Chen and Jiayu Zhou and Jieping Ye",
  booktitle = "Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining",
  year      = "2011",
  pages     = "42--50",
  doi       = "10.1145/2020408.2020423",
  isbn      = "9781450308137",
  keywords  = "Group-sparsity, Low-rank patterns, Multi-task learning, Robust",
  language  = "English (US)",
}
