Improving backpropagation learning with feature selection

Rudy Setiono, Huan Liu

Research output: Contribution to journal › Article

18 Scopus citations

Abstract

Real-world data often contain redundant, irrelevant, and noisy attributes. Training a network on properly selected data can speed up training, simplify the learned structure, and improve its performance. A two-phase training algorithm is proposed. In the first phase, the number of input units of the network is determined by an information-based method: only those attributes that meet certain inclusion criteria are used as inputs to the network. In the second phase, the number of hidden units is selected automatically based on the network's performance on the training data, with one hidden unit added at a time and only when necessary. The experimental results show that this new algorithm achieves faster learning, a simpler network, and improved performance.
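The two phases described above can be sketched in Python. This is an illustrative reconstruction, not the authors' exact algorithm: the paper's information-based criterion is approximated here by a simple thresholded information gain (the threshold value is an assumption), and the constructive second phase is reduced to retraining a small sigmoid network with one more hidden unit until a training-accuracy target is met.

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(col, y):
    """Entropy reduction in y from conditioning on one discrete attribute."""
    gain = entropy(y)
    for v in np.unique(col):
        mask = col == v
        gain -= mask.mean() * entropy(y[mask])
    return gain

def select_features(X, y, threshold=0.01):
    """Phase 1 (sketch): keep attributes whose information gain clears a
    threshold; the threshold is an assumed stand-in for the paper's criteria."""
    return [j for j in range(X.shape[1]) if information_gain(X[:, j], y) > threshold]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_accuracy(X, y, hidden, epochs=5000, lr=1.0, seed=0):
    """Train a one-hidden-layer sigmoid network by plain backpropagation
    (cross-entropy loss) and return its accuracy on the training data."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = 0.0
    n = len(y)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2).ravel()
        d2 = (out - y) / n                       # output-layer error signal
        d1 = (d2[:, None] @ W2.T) * h * (1.0 - h)  # backpropagated to hidden layer
        W2 -= lr * (h.T @ d2[:, None])
        b2 -= lr * d2.sum()
        W1 -= lr * (X.T @ d1)
        b1 -= lr * d1.sum(axis=0)
    preds = (out > 0.5).astype(int)
    return float((preds == y).mean())

def grow_network(X, y, target=1.0, max_hidden=8):
    """Phase 2 (sketch): add hidden units one at a time, retraining until the
    training-accuracy target is reached or a cap is hit."""
    for hidden in range(1, max_hidden + 1):
        acc = train_accuracy(X, y, hidden)
        if acc >= target:
            return hidden, acc
    return max_hidden, acc

# Toy data: y = x0 AND x1, plus a third attribute carrying no information
# about y (its information gain is exactly zero by construction).
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
y = X[:, 0].astype(int) & X[:, 1].astype(int)
kept = select_features(X, y)            # phase 1 drops the uninformative column
hidden, acc = grow_network(X[:, kept], y)
```

On this toy dataset phase 1 keeps only the two relevant attributes, so the network in phase 2 is trained on fewer inputs, mirroring the paper's point that pruning redundant inputs simplifies the learned structure.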

Original language: English (US)
Pages (from-to): 129-139
Number of pages: 11
Journal: Applied Intelligence
Volume: 6
Issue number: 2
DOIs: Yes
State: Published - Jan 1 1996
Externally published: Yes

Keywords

  • Backpropagation
  • Feature selection
  • Feedforward neural network
  • Information theory

ASJC Scopus subject areas

  • Artificial Intelligence
