Low-rank and sparse multi-task learning

Jianhui Chen, Jiayu Zhou, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Chapter

2 Citations (Scopus)

Abstract

Multi-task learning (MTL) aims to improve overall generalization performance by learning multiple related tasks simultaneously. Specifically, MTL exploits intrinsic task relatedness, through which informative domain knowledge from each task can be shared across tasks, thereby facilitating the learning of each individual task. Modeling the relationship among multiple tasks is critical to the practical performance of MTL. We propose to correlate multiple tasks using a low-rank representation and formulate our MTL approaches as mathematical optimization problems that minimize the empirical loss regularized by the aforementioned low-rank structure together with a separate sparse structure. For the proposed MTL approaches, we develop gradient-based optimization algorithms that efficiently find their globally optimal solutions. We also conduct theoretical analysis of our MTL approaches, deriving performance bounds that evaluate how well the integration of low-rank and sparse representations can estimate multiple related tasks.
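The abstract does not spell out the chapter's exact formulation, but the described approach — empirical loss regularized by a low-rank structure plus a separate sparse structure, solved by a gradient-based method with global optimality — can be illustrated with a minimal sketch. The sketch below assumes a common variant of this model: each task's weight vector is a column of W = L + S, with a trace-norm penalty on L (low-rank, shared structure) and an l1 penalty on S (sparse, task-specific deviations), minimized by proximal gradient descent. All function names and parameter choices here are illustrative, not the chapter's actual algorithm.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * trace norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(M, tau):
    """Entrywise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def lowrank_sparse_mtl(Xs, ys, lam_rank=0.1, lam_sparse=0.1, iters=300):
    """Proximal gradient for least-squares MTL with W = L + S.

    Xs: list of (n_t, d) design matrices, one per task.
    ys: list of (n_t,) target vectors, one per task.
    Returns the low-rank component L and sparse component S (both d x T).
    """
    d, T = Xs[0].shape[1], len(Xs)
    L = np.zeros((d, T))
    S = np.zeros((d, T))
    # Conservative step size: the joint gradient in (L, S) is Lipschitz with
    # constant at most twice the worst per-task value of ||X||^2 / n.
    step = 0.5 / max(np.linalg.norm(X, 2) ** 2 / X.shape[0] for X in Xs)
    for _ in range(iters):
        W = L + S
        # Gradient of the averaged squared loss, one column per task.
        G = np.column_stack(
            [X.T @ (X @ W[:, t] - y) / X.shape[0]
             for t, (X, y) in enumerate(zip(Xs, ys))]
        )
        # The penalty is separable in (L, S): prox each component independently.
        L = svt(L - step * G, step * lam_rank)
        S = soft_threshold(S - step * G, step * lam_sparse)
    return L, S
```

Because both penalties are convex and the loss is convex, the overall problem is convex and such a proximal gradient scheme converges to a global minimizer, matching the global-optimality claim in the abstract.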

Original language: English (US)
Title of host publication: Low-Rank and Sparse Modeling for Visual Analysis
Publisher: Springer International Publishing
Pages: 151-180
Number of pages: 30
ISBN (Print): 9783319120003, 9783319119991
DOI: 10.1007/978-3-319-12000-3_8
State: Published - Jan 1 2014

Keywords

  • Low-rank
  • Multi-task learning
  • Optimization algorithms
  • Sparsity
  • Structure regularization

ASJC Scopus subject areas

  • Computer Science(all)

Cite this

Chen, J., Zhou, J., & Ye, J. (2014). Low-rank and sparse multi-task learning. In Low-Rank and Sparse Modeling for Visual Analysis (pp. 151-180). Springer International Publishing. https://doi.org/10.1007/978-3-319-12000-3_8
