A convex formulation for learning shared structures from multiple tasks

Jianhui Chen, Lei Tang, Jun Liu, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

113 Scopus citations

Abstract

Multi-task learning (MTL) aims to improve generalization performance by learning multiple related tasks simultaneously. In this paper, we consider the problem of learning shared structures from multiple related tasks. We present an improved formulation (iASO) for multi-task learning based on the non-convex alternating structure optimization (ASO) algorithm, in which all tasks are related by a shared feature representation. We convert the non-convex iASO formulation into a relaxed convex one, which, however, does not scale to large data sets due to its complex constraints. We then propose an alternating optimization algorithm (cASO) that solves the convex relaxation efficiently, and show that cASO converges to a global optimum. In addition, we present a theoretical condition under which cASO finds a globally optimal solution to iASO. Experiments on several benchmark data sets confirm our theoretical analysis.
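
For readers who want to experiment with the idea, below is a minimal NumPy sketch of the alternating scheme the abstract describes. It rests on several assumptions of ours: a squared loss, the shared-structure penalty written as tr(W^T (eta*I + M)^{-1} W) with eta = beta/alpha, a trace constraint tr(M) = h on the relaxed subspace variable M, and a bisection-based water-filling solve for the eigenvalue subproblem. All function and variable names (caso, _waterfill, Xs, ys, h, alpha, beta) are ours; this is an illustrative reconstruction, not the authors' code.

```python
import numpy as np


def caso(Xs, ys, h, alpha=1.0, beta=1.0, n_iter=50):
    """Alternate between a per-task weight solve and a shared-subspace solve.

    Xs, ys      -- lists of per-task design matrices (n_t x d) and targets (n_t,)
    h           -- assumed shared-subspace dimension: trace constraint tr(M) = h
    alpha, beta -- regularization parameters (names are ours); eta = beta / alpha
    """
    d = Xs[0].shape[1]
    eta = beta / alpha
    c = alpha * eta * (1.0 + eta)       # coefficient on tr(W^T (eta I + M)^{-1} W)
    M = (h / d) * np.eye(d)             # feasible start: tr(M) = h, eigvals in [0, 1]

    for _ in range(n_iter):
        # Step 1: M fixed. With squared loss each task decouples into a
        # generalized ridge regression with penalty matrix c * (eta I + M)^{-1}.
        R = c * np.linalg.inv(eta * np.eye(d) + M)
        W = np.column_stack([
            np.linalg.solve(X.T @ X / len(y) + R, X.T @ y / len(y))
            for X, y in zip(Xs, ys)
        ])

        # Step 2: W fixed. The optimal M shares W's left singular vectors; its
        # eigenvalues minimize sum_i sigma_i^2 / (eta + lam_i) subject to
        # sum(lam) = h and 0 <= lam_i <= 1, solved here by water-filling.
        P, sigma, _ = np.linalg.svd(W, full_matrices=False)
        lam = _waterfill(sigma, eta, h)
        M = (P * lam) @ P.T

    return W, M


def _waterfill(sigma, eta, h):
    # Stationarity of the eigenvalue subproblem gives
    # lam_i = clip(sigma_i / sqrt(gamma) - eta, 0, 1); bisect on the
    # multiplier gamma until the trace constraint sum(lam) = h holds.
    # Assumes h does not exceed the number of nonzero singular values.
    lo, hi = 1e-12, (sigma.max() / eta) ** 2 + 1.0
    for _ in range(200):
        gamma = 0.5 * (lo + hi)
        lam = np.clip(sigma / np.sqrt(gamma) - eta, 0.0, 1.0)
        if lam.sum() > h:
            lo = gamma      # gamma too small: shrink the eigenvalues
        else:
            hi = gamma
    return lam
```

The eigenvalue subproblem admits a more direct characterization in the paper; the bisection above is simply one easy way to satisfy the trace constraint in a self-contained sketch.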

Original language: English (US)
Title of host publication: Proceedings of the 26th International Conference on Machine Learning, ICML 2009
Pages: 137-144
Number of pages: 8
State: Published - 2009
Event: 26th International Conference on Machine Learning, ICML 2009 - Montreal, QC, Canada
Duration: Jun 14, 2009 - Jun 18, 2009

Publication series

Name: Proceedings of the 26th International Conference on Machine Learning, ICML 2009

Other

Other: 26th International Conference on Machine Learning, ICML 2009
Country/Territory: Canada
City: Montreal, QC
Period: 6/14/09 - 6/18/09

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software
