Incremental Feature Selection

Huan Liu, Rudy Setiono

Research output: Contribution to journal › Article

64 Citations (Scopus)

Abstract

Feature selection is a problem of finding relevant features. When the number of features of a dataset is large and its number of patterns is huge, an effective method of feature selection can help in dimensionality reduction. An incremental probabilistic algorithm is designed and implemented as an alternative to the exhaustive and heuristic approaches. Theoretical analysis is given to support the idea of the probabilistic algorithm in finding an optimal or near-optimal subset of features. Experimental results suggest that (1) the probabilistic algorithm is effective in obtaining optimal/suboptimal feature subsets; (2) its incremental version expedites feature selection further when the number of patterns is large and can scale up without sacrificing the quality of selected features.
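The abstract describes, but does not spell out, a probabilistic (Las Vegas-style) search over feature subsets. Below is a minimal sketch of one such scheme consistent with that description: random subsets are sampled, and the smallest subset whose inconsistency rate on the data stays within a threshold is kept. The function names, the inconsistency-rate criterion, and all parameters are illustrative assumptions, not the paper's exact algorithm.

```python
import random
from collections import Counter

def inconsistency_rate(X, y, subset):
    """Fraction of patterns whose class disagrees with the majority class
    among patterns that are identical on the chosen feature subset."""
    groups = {}
    for row, label in zip(X, y):
        key = tuple(row[i] for i in subset)
        groups.setdefault(key, []).append(label)
    inconsistent = sum(len(g) - max(Counter(g).values()) for g in groups.values())
    return inconsistent / len(X)

def probabilistic_select(X, y, max_tries=500, threshold=0.0, seed=0):
    """Las Vegas-style search: repeatedly sample random feature subsets and
    keep the smallest one whose inconsistency rate is within the threshold."""
    rng = random.Random(seed)
    n_features = len(X[0])
    best = list(range(n_features))  # start from the full feature set
    for _ in range(max_tries):
        k = rng.randint(1, len(best))  # never sample larger than current best
        subset = sorted(rng.sample(range(n_features), k))
        if inconsistency_rate(X, y, subset) <= threshold and len(subset) < len(best):
            best = subset
    return best
```

An incremental variant, as the abstract suggests, would evaluate candidate subsets on a growing sample of patterns rather than the full dataset, rechecking on more data only when a candidate passes, which is what makes the approach scale when the number of patterns is large.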

Original language: English (US)
Pages (from-to): 217-230
Number of pages: 14
Journal: Applied Intelligence
Volume: 9
Issue number: 3
State: Published - 1998
Externally published: Yes

Fingerprint

Feature extraction
Set theory

Keywords

  • Dimensionality reduction
  • Feature selection
  • Machine learning
  • Pattern recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence

Cite this

Liu, H., & Setiono, R. (1998). Incremental Feature Selection. Applied Intelligence, 9(3), 217-230.

Incremental Feature Selection. / Liu, Huan; Setiono, Rudy. In: Applied Intelligence, Vol. 9, No. 3, 1998, pp. 217-230.


Liu, H & Setiono, R 1998, 'Incremental Feature Selection', Applied Intelligence, vol. 9, no. 3, pp. 217-230.
Liu, Huan; Setiono, Rudy. / Incremental Feature Selection. In: Applied Intelligence. 1998; Vol. 9, No. 3, pp. 217-230.
@article{19386bd447f74b3fbac0b8c1b1e711ff,
title = "Incremental Feature Selection",
abstract = "Feature selection is a problem of finding relevant features. When the number of features of a dataset is large and its number of patterns is huge, an effective method of feature selection can help in dimensionality reduction. An incremental probabilistic algorithm is designed and implemented as an alternative to the exhaustive and heuristic approaches. Theoretical analysis is given to support the idea of the probabilistic algorithm in finding an optimal or near-optimal subset of features. Experimental results suggest that (1) the probabilistic algorithm is effective in obtaining optimal/suboptimal feature subsets; (2) its incremental version expedites feature selection further when the number of patterns is large and can scale up without sacrificing the quality of selected features.",
keywords = "Dimensionality reduction, Feature selection, Machine learning, Pattern recognition",
author = "Huan Liu and Rudy Setiono",
year = "1998",
language = "English (US)",
volume = "9",
pages = "217--230",
journal = "Applied Intelligence",
issn = "0924-669X",
publisher = "Springer Netherlands",
number = "3",
}

TY  - JOUR
T1  - Incremental Feature Selection
AU  - Liu, Huan
AU  - Setiono, Rudy
PY  - 1998
Y1  - 1998
N2  - Feature selection is a problem of finding relevant features. When the number of features of a dataset is large and its number of patterns is huge, an effective method of feature selection can help in dimensionality reduction. An incremental probabilistic algorithm is designed and implemented as an alternative to the exhaustive and heuristic approaches. Theoretical analysis is given to support the idea of the probabilistic algorithm in finding an optimal or near-optimal subset of features. Experimental results suggest that (1) the probabilistic algorithm is effective in obtaining optimal/suboptimal feature subsets; (2) its incremental version expedites feature selection further when the number of patterns is large and can scale up without sacrificing the quality of selected features.
AB  - Feature selection is a problem of finding relevant features. When the number of features of a dataset is large and its number of patterns is huge, an effective method of feature selection can help in dimensionality reduction. An incremental probabilistic algorithm is designed and implemented as an alternative to the exhaustive and heuristic approaches. Theoretical analysis is given to support the idea of the probabilistic algorithm in finding an optimal or near-optimal subset of features. Experimental results suggest that (1) the probabilistic algorithm is effective in obtaining optimal/suboptimal feature subsets; (2) its incremental version expedites feature selection further when the number of patterns is large and can scale up without sacrificing the quality of selected features.
KW  - Dimensionality reduction
KW  - Feature selection
KW  - Machine learning
KW  - Pattern recognition
UR  - http://www.scopus.com/inward/record.url?scp=0032206482&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=0032206482&partnerID=8YFLogxK
M3  - Article
AN  - SCOPUS:0032206482
VL  - 9
SP  - 217
EP  - 230
JO  - Applied Intelligence
JF  - Applied Intelligence
SN  - 0924-669X
IS  - 3
ER  -