Stable L2-regularized ensemble feature weighting

Yun Li, Shasha Huang, Songcan Chen, Jennie Si

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

When selecting features for knowledge discovery applications, stability is a highly desirable property. Here, stability of feature selection means that the selection outcomes vary only insignificantly when the underlying data change slightly. Several stable feature selection methods have been proposed, but their stability has only been evaluated empirically. In this paper, we attempt to analyze the stability of our ensemble feature weighting algorithm. As an example, a feature weighting method based on the L2-regularized logistic loss, together with its ensembles built by linear aggregation, is introduced. A detailed analysis of the uniform stability and rotation invariance of the ensemble feature weighting method is then presented. Additionally, experiments were conducted on real-world microarray data sets. The results show that the proposed ensemble feature weighting methods preserve stability while achieving satisfactory classification performance. In most cases, at least one of them provided a better or comparable tradeoff between stability and classification when compared with other methods designed to boost stability.
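The paper itself is not reproduced in this record, but the general idea the abstract describes — fit an L2-regularized logistic model, read off per-feature weight magnitudes, and linearly aggregate them across ensemble members — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: bootstrap resampling is assumed as the source of ensemble diversity, and all function names and hyperparameters here are hypothetical.

```python
import numpy as np

def logistic_l2_weights(X, y, lam=1.0, lr=0.1, n_iter=500):
    """Fit L2-regularized logistic regression by plain gradient descent
    and return the learned weight vector (one weight per feature).
    Hyperparameters are illustrative, not taken from the paper."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        grad = X.T @ (p - y) / n + lam * w     # logistic-loss gradient + L2 term
        w -= lr * grad
    return w

def ensemble_feature_weights(X, y, n_members=20, seed=0):
    """Linear (uniform) aggregation of |w| over bootstrap resamples —
    one plausible reading of 'ensembles using linear aggregation'."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    agg = np.zeros(d)
    for _ in range(n_members):
        idx = rng.integers(0, n, size=n)       # bootstrap sample of the rows
        agg += np.abs(logistic_l2_weights(X[idx], y[idx]))
    return agg / n_members                     # averaged feature weights
```

Averaging the weight vectors of many perturbed base learners is what gives the ensemble its smoothing effect: a feature's aggregated weight depends only weakly on any single training point, which is the intuition behind the uniform-stability analysis mentioned in the abstract.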

Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 167-178
Number of pages: 12
Volume: 7872 LNCS
DOIs: 10.1007/978-3-642-38067-9_15
State: Published - 2013
Event: 11th International Workshop on Multiple Classifier Systems, MCS 2013 - Nanjing, China
Duration: May 15, 2013 - May 17, 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7872 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Li, Y., Huang, S., Chen, S., & Si, J. (2013). Stable L2-regularized ensemble feature weighting. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7872 LNCS, pp. 167-178). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 7872 LNCS). https://doi.org/10.1007/978-3-642-38067-9_15

@inproceedings{e4fde7d614e74f5e8b39e9ed58419644,
title = "Stable L2-regularized ensemble feature weighting",
author = "Yun Li and Shasha Huang and Songcan Chen and Jennie Si",
year = "2013",
doi = "10.1007/978-3-642-38067-9_15",
language = "English (US)",
isbn = "9783642380662",
volume = "7872 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "167--178",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

}
