Theory and application of the supervised learning method based on gradient algorithms Part (2). Training mechanism

Jennie Si, Guian Zhou, Han Li, Yingdou Han

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Based on gradient algorithms and the approximation capability of feedforward networks, a new comprehensive supervised training mechanism is proposed. During the learning process, the mechanism selects an appropriate parameter set to avoid overfitting and to reach the required accuracy with reduced computational and storage complexity, while retaining validity and generality. The new algorithm (PTNT) incorporates several aspects of neural network training into a single process, shortening training and improving accuracy over the BP algorithm and its derived variants. The PTNT algorithm converges like the LM algorithm, with a storage complexity far less than half that of LM. Simulation results confirm the generality of the supervised training mechanism and the feedforward network.
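The abstract contrasts PTNT with the BP and LM algorithms, but the paper's own PTNT procedure is not reproduced here. The sketch below therefore illustrates only the two baselines on an invented toy regression task: plain backpropagation (first-order gradient descent) and a single damped Levenberg-Marquardt step, whose P×P matrix JᵀJ is the source of LM's higher storage cost. The network size, learning rate, and damping factor are all illustrative choices, not values from the paper.

```python
import numpy as np

# Toy setup: fit sin(pi*x) with a 1-hidden-layer tanh network.
rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 50).reshape(-1, 1)           # inputs, shape (N, 1)
y = np.sin(np.pi * X)                               # targets

H = 8                                               # hidden units
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)

def forward(X, W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)                        # hidden activations
    return h, h @ W2 + b2                           # (activations, prediction)

# --- BP baseline: full-batch gradient descent on (half) mean squared error ---
lr = 0.2
for _ in range(3000):
    h, out = forward(X, W1, b1, W2, b2)
    err = out - y                                   # residuals, shape (N, 1)
    gW2 = h.T @ err / len(X)                        # gradient w.r.t. W2
    gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)                  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X, W1, b1, W2, b2)[1] - y) ** 2))

# --- One LM step: needs the N x P Jacobian and the P x P matrix J^T J ---
def pack(W1, b1, W2, b2):
    return np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

def residuals(theta):
    W1_ = theta[:H].reshape(1, H); b1_ = theta[H:2*H]
    W2_ = theta[2*H:3*H].reshape(H, 1); b2_ = theta[3*H:]
    return (forward(X, W1_, b1_, W2_, b2_)[1] - y).ravel()

theta = pack(W1, b1, W2, b2)
P = theta.size                                      # number of parameters
J = np.empty((len(X), P))                           # N x P Jacobian of residuals
eps = 1e-6
r0 = residuals(theta)
for j in range(P):                                  # finite-difference columns
    t = theta.copy(); t[j] += eps
    J[:, j] = (residuals(t) - r0) / eps
mu = 1e-2                                           # LM damping factor
step = np.linalg.solve(J.T @ J + mu * np.eye(P), J.T @ r0)
theta_lm = theta - step                             # damped Gauss-Newton step
```

The O(P²) memory for `J.T @ J` (plus the N×P Jacobian) is what makes LM expensive for larger networks, which is the storage cost the abstract's "far less than half" comparison refers to.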

Original language: English (US)
Pages (from-to): 104-107, 117
Journal: Qinghua Daxue Xuebao/Journal of Tsinghua University
Volume: 37
Issue number: 9
State: Published - 1997
Externally published: Yes

Fingerprint

Gradient Algorithm
Supervised Learning
Feedforward Networks
BP Algorithm
Overfitting
Learning Process
Training
Neural Networks
Convergence
Approximation
Simulation

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Theory and application of the supervised learning method based on gradient algorithms Part (2). Training mechanism. / Si, Jennie; Zhou, Guian; Li, Han; Han, Yingdou.

In: Qinghua Daxue Xuebao/Journal of Tsinghua University, Vol. 37, No. 9, 1997, p. 104-107, 117.

Research output: Contribution to journal › Article

@article{8430dbfbf1964b3f951a91f6a96af6f2,
title = "Theory and application of the supervised learning method based on gradient algorithms Part (2). Training mechanism",
abstract = "Based on gradient algorithm and the fundamental approximation of the feedforward network, a new supervised comprehensive training mechanism is put forward. In the realization of learning process, the training mechanism can choose the appropriate parameter set to avoid overfitting and achieve required accuracy with reduced calculation and storage complexity as well as satisfactory validity and generality. The new algorithm (PTNT) can incorporate several aspects of the neural network training in the same process with lessened training process and improved accuracy over the BP algorithm and inherited algorithm. PTNT algorithm converges like LM algorithm, with a storage complexity far less than half of the latter. Simulation results justified the generality of the supervised training mechanism and feedforward network.",
author = "Jennie Si and Guian Zhou and Han Li and Yingdou Han",
year = "1997",
language = "English (US)",
volume = "37",
pages = "104--107, 117",
journal = "Qinghua Daxue Xuebao/Journal of Tsinghua University",
issn = "1000-0054",
publisher = "Press of Tsinghua University",
number = "9",

}

TY - JOUR

T1 - Theory and application of the supervised learning method based on gradient algorithms Part (2). Training mechanism

AU - Si, Jennie

AU - Zhou, Guian

AU - Li, Han

AU - Han, Yingdou

PY - 1997

Y1 - 1997

N2 - Based on gradient algorithm and the fundamental approximation of the feedforward network, a new supervised comprehensive training mechanism is put forward. In the realization of learning process, the training mechanism can choose the appropriate parameter set to avoid overfitting and achieve required accuracy with reduced calculation and storage complexity as well as satisfactory validity and generality. The new algorithm (PTNT) can incorporate several aspects of the neural network training in the same process with lessened training process and improved accuracy over the BP algorithm and inherited algorithm. PTNT algorithm converges like LM algorithm, with a storage complexity far less than half of the latter. Simulation results justified the generality of the supervised training mechanism and feedforward network.

AB - Based on gradient algorithm and the fundamental approximation of the feedforward network, a new supervised comprehensive training mechanism is put forward. In the realization of learning process, the training mechanism can choose the appropriate parameter set to avoid overfitting and achieve required accuracy with reduced calculation and storage complexity as well as satisfactory validity and generality. The new algorithm (PTNT) can incorporate several aspects of the neural network training in the same process with lessened training process and improved accuracy over the BP algorithm and inherited algorithm. PTNT algorithm converges like LM algorithm, with a storage complexity far less than half of the latter. Simulation results justified the generality of the supervised training mechanism and feedforward network.

UR - http://www.scopus.com/inward/record.url?scp=0031348584&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0031348584&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0031348584

VL - 37

SP - 104-107, 117

JO - Qinghua Daxue Xuebao/Journal of Tsinghua University

JF - Qinghua Daxue Xuebao/Journal of Tsinghua University

SN - 1000-0054

IS - 9

ER -