TY - JOUR
T1 - Theory and applications for the supervised learning method based on gradient algorithms Part I-Fundamental algorithm
AU - Si, Jie
AU - Zhou, Guian
AU - Li, Han
AU - Han, Yingduo
N1 - Copyright:
Copyright 2004 Elsevier Science B.V., Amsterdam. All rights reserved.
PY - 1997
Y1 - 1997
N2 - A family of gradient-based supervised training algorithms and their applications are discussed. The Levenberg-Marquardt (LM) algorithm, a combination of the gradient method and the Gauss-Newton method, is examined in detail. It combines the fast local convergence of the Gauss-Newton method with the global convergence of the gradient method. When the damping parameter μ is large, the LM algorithm behaves like the gradient method; when μ is small, it approaches the Gauss-Newton method. By exploiting approximate second-derivative information, the LM algorithm is more efficient than the gradient method. In terms of training speed and accuracy, the LM method is superior to the conjugate gradient method and the variable-learning-rate BP method. Because the coefficient matrix is positive definite, a solution always exists; from this viewpoint, the LM method is preferable to the Gauss-Newton method.
AB - A family of gradient-based supervised training algorithms and their applications are discussed. The Levenberg-Marquardt (LM) algorithm, a combination of the gradient method and the Gauss-Newton method, is examined in detail. It combines the fast local convergence of the Gauss-Newton method with the global convergence of the gradient method. When the damping parameter μ is large, the LM algorithm behaves like the gradient method; when μ is small, it approaches the Gauss-Newton method. By exploiting approximate second-derivative information, the LM algorithm is more efficient than the gradient method. In terms of training speed and accuracy, the LM method is superior to the conjugate gradient method and the variable-learning-rate BP method. Because the coefficient matrix is positive definite, a solution always exists; from this viewpoint, the LM method is preferable to the Gauss-Newton method.
UR - http://www.scopus.com/inward/record.url?scp=0031387605&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0031387605&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:0031387605
VL - 37
SP - 71
EP - 73
JO - Qinghua Daxue Xuebao/Journal of Tsinghua University
JF - Qinghua Daxue Xuebao/Journal of Tsinghua University
SN - 1000-0054
IS - 7
ER -
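
The abstract above describes the standard Levenberg-Marquardt update; the sketch below is a minimal illustration of that rule (not taken from the paper itself), showing how the damping parameter μ (mu) interpolates between a gradient-descent step and a Gauss-Newton step, and why the linear system is always solvable when the coefficient matrix J^T J + μI is positive definite. The names lm_step, train, and the residual/Jacobian callables res and jac are hypothetical, assumed here for a simple least-squares setting.

    import numpy as np

    def lm_step(J, r, mu):
        """One Levenberg-Marquardt update for a least-squares objective.

        Solves (J^T J + mu*I) dw = -J^T r. As mu grows large the step tends
        to a scaled gradient-descent step; as mu -> 0 it tends to the
        Gauss-Newton step. J^T J + mu*I is positive definite for mu > 0,
        so the linear system always has a unique solution.
        """
        A = J.T @ J + mu * np.eye(J.shape[1])
        g = J.T @ r                        # gradient of 0.5 * ||r||^2
        return -np.linalg.solve(A, g)

    def train(res, jac, w, mu=1e-2, n_iter=100):
        """Simple LM loop: shrink mu after a successful step, grow it otherwise."""
        for _ in range(n_iter):
            r, J = res(w), jac(w)
            dw = lm_step(J, r, mu)
            if np.sum(res(w + dw) ** 2) < np.sum(r ** 2):
                w, mu = w + dw, mu * 0.5   # accept step, move toward Gauss-Newton
            else:
                mu *= 2.0                  # reject step, fall back toward gradient descent
        return w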