Based on the gradient algorithm and the universal approximation property of feedforward networks, a new supervised comprehensive training mechanism is proposed. During learning, this mechanism selects an appropriate parameter set to avoid overfitting and to reach the required accuracy with reduced computation and storage complexity, while retaining validity and generality. The new algorithm (PTNT) integrates several aspects of neural network training into a single process, shortening training and improving accuracy over the BP algorithm and algorithms derived from it. PTNT converges like the Levenberg-Marquardt (LM) algorithm, with a storage complexity far less than half that of LM. Simulation results confirm the generality of the supervised training mechanism and the feedforward network.
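The PTNT algorithm itself is not specified in this abstract. As background for the storage comparison it makes, the following is a minimal sketch (not the paper's method) of a standard Levenberg-Marquardt step on a tiny one-parameter-layer feedforward unit: LM must form and solve with the P × P matrix JᵀJ (P = number of weights), which is the storage cost the abstract says PTNT reduces to far less than half. All names and the toy problem here are illustrative assumptions.

```python
import numpy as np

# Illustrative LM step for a single tanh unit (NOT the paper's PTNT algorithm).
# LM stores the P x P matrix J^T J; a plain gradient (BP) step stores only
# the P-vector gradient. This difference is the storage cost being compared.

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                  # 20 samples, 3 inputs
w_true = np.array([0.5, -1.0, 2.0])           # assumed "true" weights (toy data)
y = np.tanh(X @ w_true)                       # realizable targets

w = np.zeros(3)                               # weights to train
lam = 1e-2                                    # LM damping factor (fixed here)

for _ in range(50):
    pred = np.tanh(X @ w)
    r = pred - y                              # residual vector
    # Jacobian of residuals w.r.t. weights: dr_i/dw_j = (1 - pred_i^2) * X_ij
    J = (1.0 - pred**2)[:, None] * X          # shape (20, 3)
    # LM update: solve (J^T J + lam*I) dw = -J^T r  -- needs the P x P matrix
    dw = np.linalg.solve(J.T @ J + lam * np.eye(3), -J.T @ r)
    w += dw
```

With the damping term `lam*I`, the step interpolates between Gauss-Newton (small `lam`) and a scaled gradient step (large `lam`); the JᵀJ storage grows quadratically in the weight count, which is why halving it matters for larger networks.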
- Original language: English (US)
- Pages (from-to): 104-107, 117
- Journal: Qinghua Daxue Xuebao/Journal of Tsinghua University
- State: Published - Dec 1 1997
ASJC Scopus subject areas
- Computer Science Applications
- Applied Mathematics