Low-rank and sparse multi-task learning

Jianhui Chen, Jiayu Zhou, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Chapter

2 Scopus citations

Abstract

Multi-task learning (MTL) aims to improve overall generalization performance by learning multiple related tasks simultaneously. Specifically, MTL exploits intrinsic task relatedness, through which informative domain knowledge from each task can be shared across tasks, thereby facilitating the learning of the individual tasks. Modeling the relationship among multiple tasks is critical to the practical performance of MTL. We propose to correlate multiple tasks using a low-rank representation and formulate our MTL approaches as mathematical optimization problems that minimize the empirical loss regularized by the aforementioned low-rank structure and a separate sparse structure. For the proposed MTL approaches, we develop gradient-based optimization algorithms that efficiently find their globally optimal solutions. We also conduct a theoretical analysis of our MTL approaches, deriving performance bounds that evaluate how well the integration of low-rank and sparse representations can estimate multiple related tasks.
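The chapter develops the formulation in detail; as a rough sketch of the general idea, one standard way to instantiate "empirical loss regularized by a low-rank structure and a separate sparse structure" is to decompose the task weight matrix as W = P + Q, penalizing P with the trace norm (low rank) and Q with the ℓ1 norm (sparsity), and to solve the resulting convex problem by proximal gradient descent. The Python sketch below follows that recipe under a squared-loss assumption; the function names, parameters, and step size are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def prox_l1(Q, t):
    """Soft-thresholding: proximal operator of t * ||Q||_1 (element-wise sparsity)."""
    return np.sign(Q) * np.maximum(np.abs(Q) - t, 0.0)

def prox_trace(P, t):
    """Singular-value soft-thresholding: proximal operator of t * ||P||_* (low rank)."""
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def low_rank_sparse_mtl(Xs, ys, gamma=1.0, lam=0.1, step=1e-3, iters=500):
    """Illustrative proximal-gradient solver (assumed formulation, not the chapter's code):

        min_{P,Q}  sum_k ||X_k (P + Q)[:, k] - y_k||^2 + gamma * ||P||_* + lam * ||Q||_1

    Xs, ys: per-task design matrices (n_k x d) and targets (n_k,); tasks share d features.
    """
    d, m = Xs[0].shape[1], len(Xs)
    P = np.zeros((d, m))   # shared low-rank component
    Q = np.zeros((d, m))   # task-specific sparse component
    for _ in range(iters):
        W = P + Q
        # Gradient of the squared empirical loss w.r.t. W, one column per task;
        # it is the same for P and Q since the loss depends only on their sum.
        G = np.column_stack([Xs[k].T @ (Xs[k] @ W[:, k] - ys[k]) for k in range(m)])
        # Forward-backward step on each component with its own proximal map.
        P = prox_trace(P - step * G, step * gamma)
        Q = prox_l1(Q - step * G, step * lam)
    return P, Q

# Tiny usage example on synthetic data (5 related tasks, 20 shared features).
rng = np.random.default_rng(0)
Xs = [rng.standard_normal((30, 20)) for _ in range(5)]
w_shared = rng.standard_normal(20)                      # rank-1 shared signal
ys = [X @ w_shared + 0.1 * rng.standard_normal(30) for X in Xs]
P, Q = low_rank_sparse_mtl(Xs, ys)
print("rank of P:", np.linalg.matrix_rank(P, tol=1e-3), "| nonzeros in Q:", np.count_nonzero(Q))
```

Because both penalties are convex and the nonsmooth term is separable across the two components, each iteration is a standard forward-backward step, consistent with the globally optimal solutions the abstract refers to.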

Original language: English (US)
Title of host publication: Low-Rank and Sparse Modeling for Visual Analysis
Publisher: Springer International Publishing
Pages: 151-180
Number of pages: 30
ISBN (Electronic): 9783319120003
ISBN (Print): 9783319119991
State: Published - Jan 1 2014

Keywords

  • Low-rank
  • Multi-task learning
  • Optimization algorithms
  • Sparsity
  • Structure regularization

ASJC Scopus subject areas

  • General Computer Science
