Abstract
Multi-task learning (MTL) aims to improve overall generalization performance by learning multiple related tasks simultaneously. Specifically, MTL exploits intrinsic task relatedness, through which informative domain knowledge from each task can be shared across tasks, thereby facilitating the learning of each individual task. Modeling the relationships among multiple tasks is critical to the practical performance of MTL. We propose to correlate multiple tasks using a low-rank representation and formulate our MTL approaches as mathematical optimization problems that minimize the empirical loss regularized by the aforementioned low-rank structure and a separate sparse structure. For the proposed MTL approaches, we develop gradient-based optimization algorithms to efficiently find their globally optimal solutions. We also conduct theoretical analysis of our MTL approaches, i.e., deriving performance bounds to evaluate how well the integration of low-rank and sparse representations estimates multiple related tasks.
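As a rough illustration of this kind of formulation, the sketch below fits per-task linear weights decomposed as W = L + S, where a nuclear-norm penalty encourages L to be low-rank (shared structure across tasks) and an ℓ1 penalty encourages S to be sparse (task-specific deviations), optimized with simple proximal gradient steps. The squared loss, function names, and hyperparameters are illustrative assumptions, not the chapter's exact algorithm or the setting of its performance bounds.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Elementwise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def lowrank_sparse_mtl(Xs, ys, lam_L=1.0, lam_S=0.1, step=1e-3, iters=500):
    """Proximal gradient sketch for
        sum_t ||X_t (L[:,t] + S[:,t]) - y_t||^2 + lam_L ||L||_* + lam_S ||S||_1,
    where column t of W = L + S is the weight vector of task t (illustrative only)."""
    d, T = Xs[0].shape[1], len(Xs)
    L = np.zeros((d, T))
    S = np.zeros((d, T))
    for _ in range(iters):
        G = np.zeros((d, T))  # gradient of the empirical loss w.r.t. W = L + S
        for t in range(T):
            w = L[:, t] + S[:, t]
            G[:, t] = 2.0 * Xs[t].T @ (Xs[t] @ w - ys[t])
        L = svt(L - step * G, step * lam_L)   # prox step on the low-rank part
        S = soft(S - step * G, step * lam_S)  # prox step on the sparse part
    return L, S

# Toy usage: 5 related regression tasks sharing rank-1 weights plus sparse task-specific offsets
rng = np.random.default_rng(0)
d, T, n = 20, 5, 50
base = rng.normal(size=(d, 1)) @ rng.normal(size=(1, T))  # shared low-rank weights
spikes = np.zeros((d, T))
spikes[rng.integers(0, d, size=T), range(T)] = 2.0        # sparse per-task deviations
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [Xs[t] @ (base + spikes)[:, t] + 0.01 * rng.normal(size=n) for t in range(T)]
L, S = lowrank_sparse_mtl(Xs, ys)
print("numerical rank of estimated L:", np.linalg.matrix_rank(L, tol=1e-2))
```

The two proximal operators mirror the two regularizers: singular value thresholding shrinks the spectrum of L toward low rank, while elementwise soft thresholding zeroes out small entries of S.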
| Original language | English (US) |
|---|---|
| Title of host publication | Low-Rank and Sparse Modeling for Visual Analysis |
| Publisher | Springer International Publishing |
| Pages | 151-180 |
| Number of pages | 30 |
| ISBN (Print) | 9783319120003, 9783319119991 |
| DOIs | |
| State | Published - Jan 1 2014 |
Keywords
- Low-rank
- Multi-task learning
- Optimization algorithms
- Sparsity
- Structure regularization
ASJC Scopus subject areas
- Computer Science (all)