Heterogeneous representation learning with separable structured sparsity regularization

Abstract

Motivated by real applications, heterogeneous learning has emerged as an important research area that aims to model the coexistence of multiple types of heterogeneity. In this paper, we propose a heterogeneous representation learning model with structured sparsity regularization (HERES) to learn from multiple types of heterogeneity. HERES aims to leverage the rich correlations (e.g., task relatedness, view consistency, and label correlation) and the prior knowledge (e.g., the soft clustering of tasks) of heterogeneous data to improve learning performance. To this end, it integrates multi-task, multi-view, and multi-label learning into a principled representation learning framework to model the complex correlations, and employs structured sparsity to encode the prior knowledge of the data. The objective is to simultaneously minimize the reconstruction loss incurred when using the factor matrices to recover the heterogeneous data and the structured sparsity penalty imposed on the model. The resulting optimization problem is challenging because structured sparsity is both non-smooth and non-separable. We reformulate the problem using an auxiliary function and prove that the reformulation is separable, which leads to an efficient family of algorithms for solving structured-sparsity-penalized problems. Furthermore, we propose several HERES variants based on different loss functions and subsume them under weighted HERES, which can handle missing data. Experimental results, in comparison with state-of-the-art methods, demonstrate the effectiveness of the proposed approach.
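
To make the abstract's key ingredients concrete (a reconstruction loss over factor matrices, a structured sparsity penalty, and an auxiliary-function reformulation that makes the penalty separable), the following is a minimal sketch and not the HERES model itself: it assumes a single data matrix X factorized as X ≈ UV with a generic row-wise ℓ2,1 penalty, and uses the standard reweighting surrogate for that norm. The function name l21_factorize and all parameter choices are illustrative assumptions, not the paper's formulation.

```python
import numpy as np


def l21_factorize(X, k, lam=0.1, n_iter=100, eps=1e-8, seed=0):
    """Factorize X ~ U @ V with a row-wise l2,1 penalty on U:

        min_{U,V} ||X - U V||_F^2 + lam * sum_i ||U_i||_2

    The non-smooth l2,1 term is handled with the classic reweighting
    (auxiliary-function) trick: at each iteration it is replaced by the
    smooth surrogate tr(U^T D U), where D is diagonal with
    D_ii = 1 / (2 ||U_i||_2). Because D is diagonal, the U-step
    decouples into independent (separable) per-row linear systems.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((k, d))

    for _ in range(n_iter):
        # Auxiliary-function step: rebuild the diagonal reweighting terms.
        row_norms = np.linalg.norm(U, axis=1) + eps   # guard against /0
        d_diag = 1.0 / (2.0 * row_norms)

        # U-step: min ||X - U V||_F^2 + lam * tr(U^T D U).
        # Setting the gradient to zero gives, for each row i,
        #   (V V^T + lam * D_ii * I) u_i = V x_i,
        # i.e., n small k-by-k systems that can be solved independently.
        VVt = V @ V.T
        for i in range(n):
            A = VVt + lam * d_diag[i] * np.eye(k)
            U[i] = np.linalg.solve(A, V @ X[i])

        # V-step: plain least squares given the current U.
        V = np.linalg.solve(U.T @ U + eps * np.eye(k), U.T @ X)

    return U, V


if __name__ == "__main__":
    # Toy usage: rows of U whose l2 norm shrinks toward zero are the
    # ones effectively pruned by the structured sparsity penalty.
    X = np.random.default_rng(1).standard_normal((50, 20))
    U, V = l21_factorize(X, k=5, lam=5.0)
    print(np.round(np.linalg.norm(U, axis=1), 3))
```

The point of the surrogate is that, once the reweighting terms are fixed, the non-smooth penalty becomes a sum of independent quadratic pieces, so the update splits into small per-row systems; this is the kind of separability the paper's auxiliary-function reformulation exploits, there across the far richer multi-task, multi-view, and multi-label structure.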

Original language: English (US)
Pages (from-to): 1-24
Number of pages: 24
Journal: Knowledge and Information Systems
DOI: 10.1007/s10115-017-1094-5
State: Accepted/In press - Aug 9 2017

Keywords

  • Heterogeneous learning
  • Multi-label learning
  • Multi-task learning
  • Multi-view learning
  • Structured sparsity

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Human-Computer Interaction
  • Hardware and Architecture
  • Artificial Intelligence

Cite this

Yang, Pei; Tan, Qi; Zhu, Yada; He, Jingrui. Heterogeneous representation learning with separable structured sparsity regularization. In: Knowledge and Information Systems, 09.08.2017, p. 1-24.

Research output: Contribution to journal › Article

@article{cf80ce4d0c7041d397aad17e420fe0dd,
title = "Heterogeneous representation learning with separable structured sparsity regularization",
keywords = "Heterogeneous learning, Multi-label learning, Multi-task learning, Multi-view learning, Structured sparsity",
author = "Pei Yang and Qi Tan and Yada Zhu and Jingrui He",
year = "2017",
month = "8",
day = "9",
doi = "10.1007/s10115-017-1094-5",
language = "English (US)",
pages = "1--24",
journal = "Knowledge and Information Systems",
issn = "0219-1377",
publisher = "Springer London",

}

TY - JOUR

T1 - Heterogeneous representation learning with separable structured sparsity regularization

AU - Yang, Pei

AU - Tan, Qi

AU - Zhu, Yada

AU - He, Jingrui

PY - 2017/8/9

Y1 - 2017/8/9

KW - Heterogeneous learning

KW - Multi-label learning

KW - Multi-task learning

KW - Multi-view learning

KW - Structured sparsity

UR - http://www.scopus.com/inward/record.url?scp=85027107812&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85027107812&partnerID=8YFLogxK

U2 - 10.1007/s10115-017-1094-5

DO - 10.1007/s10115-017-1094-5

M3 - Article

AN - SCOPUS:85027107812

SP - 1

EP - 24

JO - Knowledge and Information Systems

JF - Knowledge and Information Systems

SN - 0219-1377

ER -