Abstract

Many real-world applications require multi-way rather than single-way feature selection. Multi-way feature selection is more challenging than its single-way counterpart because of the inter-correlation among the multi-way features. To address this challenge, we propose a novel non-negative matrix tri-factorization model with co-sparsity regularization that facilitates feature co-shrinking for co-clustering. The basic idea is to learn the inter-correlation among the multi-way features while shrinking the irrelevant ones by encouraging co-sparsity of the model parameters. The objective simultaneously minimizes the loss function of the matrix tri-factorization and the co-sparsity regularization imposed on the model. Furthermore, we develop an efficient, convergence-guaranteed algorithm that solves the resulting non-smooth optimization problem in an iterative-update fashion. Experimental results on various data sets demonstrate the effectiveness of the proposed approach.
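The model at the heart of the abstract is non-negative matrix tri-factorization: a data matrix X is approximated as the product of three non-negative factors, X ≈ F S Gᵀ, where F clusters the rows and G clusters the columns. The sketch below shows only this plain tri-factorization with standard multiplicative updates; the function name `nmtf`, the rank choices, and the update rules are illustrative assumptions, and the paper's co-sparsity penalty (which would add extra terms to these updates) is not reproduced here.

```python
import numpy as np

def nmtf(X, k, l, n_iter=300, eps=1e-9, seed=0):
    """Plain non-negative matrix tri-factorization X ~= F @ S @ G.T
    via multiplicative updates. F is (m, k): row clusters; G is (n, l):
    column clusters; S is (k, l): block association matrix.
    Illustrative sketch only, not the paper's regularized algorithm."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    # Non-negative random initialization; multiplicative updates
    # then preserve non-negativity at every iteration.
    F = rng.random((m, k))
    S = rng.random((k, l))
    G = rng.random((n, l))
    for _ in range(n_iter):
        # Each factor is rescaled by the ratio of the gradient's
        # positive and negative parts; eps guards against division by zero.
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
    return F, S, G
```

A co-sparsity regularizer along the lines of the abstract would add a penalty on rows of F and G to the objective, which shifts the denominators of these updates; the co-clustering assignments are then read off the dominant entries of each row of F and G.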

Original language: English (US)
Pages (from-to): 12-19
Number of pages: 8
Journal: Pattern Recognition (ISSN 0031-3203, Elsevier)
Volume: 77
DOI: 10.1016/j.patcog.2017.12.005
State: Published - May 1, 2018


Keywords

  • Co-clustering
  • Co-feature-selection
  • Co-sparsity
  • Non-negative matrix tri-factorization

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

Cite this

Tan, Q., Yang, P., & He, J. (2018). Feature co-shrinking for co-clustering. Pattern Recognition, 77, 12-19. https://doi.org/10.1016/j.patcog.2017.12.005