Supervised low rank matrix approximation for stable feature selection

Salem Alelyani, Huan Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Increasing attention has been focused on the stability of selected features, or selection stability, which is becoming a new measure of the effectiveness of a feature selection algorithm alongside learning performance. A recent study has shown that data characteristics play a significant role in selection stability; hence, the solution to selection instability should begin with the data. In this work, we propose a novel framework that adds a noise-reduction step before feature selection. Noise reduction is achieved via well-known low rank matrix approximation techniques (namely, SVD and NMF) applied in a supervised manner to reduce data noise and the variance between samples from the same class. The new framework is empirically shown to be highly effective on real high-dimensional datasets, improving both selection stability and the precision of selecting relevant features while maintaining classification accuracy across various feature selection methods.
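The supervised denoising step described in the abstract can be sketched as per-class truncated SVD: samples of each class are replaced by their best low-rank approximation before any feature selector runs. This is only a minimal illustration of the idea, not the authors' implementation; the function name and the `rank` parameter are our own assumptions.

```python
import numpy as np

def supervised_lowrank_denoise(X, y, rank=5):
    """Denoise X (samples x features) class by class via truncated SVD.

    Sketch of the supervised low-rank approximation idea: the rows
    belonging to each class are approximated by their top-`rank`
    singular components, reducing noise and within-class variance.
    `rank` is a hypothetical tuning parameter, not from the paper.
    """
    X_denoised = X.astype(float)  # work on a float copy
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Xc = X_denoised[idx]
        # Thin SVD of this class's sample block
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        k = min(rank, len(s))
        # Rank-k reconstruction replaces the original samples
        X_denoised[idx] = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
    return X_denoised
```

The denoised matrix can then be passed to any feature selection method in place of the raw data; the paper's NMF variant would substitute a nonnegative factorization for the SVD step.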

Original language: English (US)
Title of host publication: Proceedings - 2012 11th International Conference on Machine Learning and Applications, ICMLA 2012
Pages: 324-329
Number of pages: 6
Volume: 1
DOI: 10.1109/ICMLA.2012.61
State: Published - 2012
Event: 11th IEEE International Conference on Machine Learning and Applications, ICMLA 2012 - Boca Raton, FL, United States
Duration: Dec 12, 2012 - Dec 15, 2012

Other

Other: 11th IEEE International Conference on Machine Learning and Applications, ICMLA 2012
Country: United States
City: Boca Raton, FL
Period: 12/12/12 - 12/15/12

Keywords

  • Low Rank Approximation
  • Noise reduction
  • selection algorithms
  • stability

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Education

Cite this

Alelyani, S., & Liu, H. (2012). Supervised low rank matrix approximation for stable feature selection. In Proceedings - 2012 11th International Conference on Machine Learning and Applications, ICMLA 2012 (Vol. 1, pp. 324-329). [6406683] https://doi.org/10.1109/ICMLA.2012.61

@inproceedings{e5538dfecc6b44ecb14a61e40d07821d,
title = "Supervised low rank matrix approximation for stable feature selection",
abstract = "Increasing attention has been focused on the stability of selected features or selection stability, which is becoming a new measure in determining the effectiveness of a feature selection algorithm besides the learning performance. A recent study has shown that data characteristics play a significant role in selection stability. Hence, the solution to selection instability should begin with data. In this work, we propose a novel framework with a noise-reduction step before feature selection. Noise reduction is achieved via well-known low rank matrix approximation techniques (namely SVD and NMF) in a supervised manner to reduce data noise and variance between samples from the same class. The new framework is empirically shown to be highly effective with real high-dimensional datasets improving both selection stability and the precision of selecting relevant features while maintaining the classification accuracy for various feature selection methods.",
keywords = "Low Rank Approximation, Noise reduction, selection algorithms, stability",
author = "Salem Alelyani and Huan Liu",
year = "2012",
doi = "10.1109/ICMLA.2012.61",
language = "English (US)",
isbn = "9780769549132",
volume = "1",
pages = "324--329",
booktitle = "Proceedings - 2012 11th International Conference on Machine Learning and Applications, ICMLA 2012",

}
