Neural-network feature selector

Rudy Setiono, Huan Liu

Research output: Contribution to journal › Review article

258 Scopus citations

Abstract

Feature selection is an integral part of most learning algorithms. Because data often contain irrelevant and redundant attributes, a machine learning method can be expected to achieve higher predictive accuracy when only the relevant attributes are selected. In this paper, we propose the use of a three-layer feedforward neural network to select those input attributes that are most useful for discriminating classes in a given set of input patterns. A network pruning algorithm is the foundation of the proposed method. By adding a penalty term to the error function of the network, redundant network connections can be distinguished from relevant ones by their small weights once training is complete. A simple criterion for removing an attribute, based on the accuracy rate of the network, is developed. The network is retrained after each removal, and the selection process is repeated until no attribute meets the criterion for removal. Our experimental results suggest that the proposed method works very well on a wide variety of classification problems.
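The abstract describes an iterative loop: train a penalized three-layer network, test whether an attribute can be removed without an unacceptable drop in accuracy, retrain, and repeat. The following is a minimal sketch of that loop, not the authors' actual algorithm: it uses a small numpy feedforward network with a simple quadratic weight penalty, and it evaluates candidate removals by retraining rather than by the paper's specific removal criterion. All function names, hyperparameters, and the tolerance `tol` are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_net(X, y, hidden=4, epochs=2000, lr=0.5, penalty=1e-3, seed=0):
    """Train a one-hidden-layer net with cross-entropy loss plus a
    quadratic weight penalty (a stand-in for the paper's penalty term)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)           # hidden activations
        p = sigmoid(H @ W2 + b2)           # output probability
        g = (p - y) / n                    # dL/dz at the output (cross entropy)
        gW2 = H.T @ g + penalty * W2
        gb2 = g.sum()
        gH = np.outer(g, W2) * H * (1 - H) # backprop through hidden layer
        gW1 = X.T @ gH + penalty * W1
        gb1 = gH.sum(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return W1, b1, W2, b2

def accuracy(params, X, y):
    W1, b1, W2, b2 = params
    p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((p > 0.5) == y))

def select_features(X, y, tol=0.05):
    """Backward elimination: drop one attribute at a time while the
    retrained network's accuracy stays within tol of the baseline."""
    keep = list(range(X.shape[1]))
    params = train_net(X[:, keep], y)
    base = accuracy(params, X[:, keep], y)
    while len(keep) > 1:
        # try removing each remaining attribute; keep the least harmful removal
        scores = []
        for j in range(len(keep)):
            cand = keep[:j] + keep[j + 1:]
            cand_params = train_net(X[:, cand], y)
            scores.append((accuracy(cand_params, X[:, cand], y), j))
        best_acc, best_j = max(scores)
        if best_acc >= base - tol:         # removal criterion met: prune it
            keep.pop(best_j)
            base = max(base, best_acc)
        else:                              # no attribute meets the criterion
            break
    return keep
```

On synthetic data where the label depends on only one attribute, the loop should discard the noise attributes and retain the relevant one; the real method additionally exploits the small weights induced by the penalty term to identify redundant connections.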

Original language: English (US)
Pages (from-to): 654-662
Number of pages: 9
Journal: IEEE Transactions on Neural Networks
Volume: 8
Issue number: 3
DOIs
State: Published - Dec 1 1997
Externally published: Yes

Keywords

  • Backpropagation
  • Cross entropy
  • Feature selection
  • Feedforward neural network
  • Network pruning
  • Penalty term

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

