Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency

Guian Zhou, Jennie Si

Research output: Contribution to journal › Article

50 Citations (Scopus)

Abstract

In this paper we introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, a modified Gauss-Newton method, has been used successfully in solving nonlinear least squares problems, including neural-network training. It significantly outperforms basic backpropagation and its variable-learning-rate variants in training accuracy, convergence properties, and overall training time, but at the cost of higher computation and memory complexity within each iteration. The new method developed in this paper aims to improve convergence properties while reducing the memory and computation complexity of supervised neural-network training. Extensive simulation results are provided to demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.
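The Levenberg-Marquardt iteration the abstract compares against can be sketched as follows. This is a minimal textbook-style illustration on a toy curve-fitting problem, not the paper's reduced-complexity variant; the function names, damping schedule, and test problem are illustrative assumptions, not from the paper. Each step solves the damped Gauss-Newton system (JᵀJ + μI)δ = -Jᵀr, which is exactly where the per-iteration computation and memory cost the abstract mentions comes from: J has one row per training sample and one column per weight.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, w0, mu=1e-2, iters=50):
    """Basic Levenberg-Marquardt loop for nonlinear least squares.

    Each step solves (J^T J + mu*I) delta = -J^T r, the damped
    Gauss-Newton system; the damping mu is adapted after each step.
    """
    w = np.asarray(w0, dtype=float)
    for _ in range(iters):
        r = residual(w)
        J = jacobian(w)
        A = J.T @ J + mu * np.eye(len(w))   # the O(n^2) memory term
        delta = np.linalg.solve(A, -J.T @ r)
        w_new = w + delta
        if np.sum(residual(w_new) ** 2) < np.sum(r ** 2):
            w, mu = w_new, mu * 0.5   # step accepted: trust the model more
        else:
            mu *= 2.0                 # step rejected: increase damping
    return w

# Toy problem: recover (a, b) in y = a * exp(b * x) from noiseless data.
x = np.linspace(0.0, 1.0, 20)
a_true, b_true = 2.0, -1.5
y = a_true * np.exp(b_true * x)

res = lambda w: w[0] * np.exp(w[1] * x) - y
jac = lambda w: np.stack([np.exp(w[1] * x),
                          w[0] * x * np.exp(w[1] * x)], axis=1)

w_fit = levenberg_marquardt(res, jac, w0=[1.0, 0.0])
```

In neural-network training the residual vector collects the output errors over all training samples and the Jacobian is taken with respect to all weights, so forming and factoring JᵀJ dominates the per-iteration cost; exploiting rank deficiency of J, as the paper proposes, reduces that cost.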

Original language: English (US)
Pages (from-to): 448-453
Number of pages: 6
Journal: IEEE Transactions on Neural Networks
Volume: 9
Issue number: 3
DOIs: 10.1109/72.668886
State: Published - 1998


Keywords

  • Gauss-Newton method
  • Jacobian rank deficiency
  • Neural-network training
  • Subset updating
  • Trust region algorithms

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture

Cite this

Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency. / Zhou, Guian; Si, Jennie.

In: IEEE Transactions on Neural Networks, Vol. 9, No. 3, 1998, p. 448-453.

Research output: Contribution to journal › Article

@article{bf43c607ea9748e79b7934d872b56b8a,
title = "Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency",
abstract = "In this paper we introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and it is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, as a modified Gauss-Newton, has been used successfully in solving nonlinear least squares problems including neural-network training. It outperforms (in terms of training accuracy, convergence properties, overall training time, etc.) the basic backpropagation and its variations with variable learning rate significantly, however, with higher computation and memory complexities within each iteration. The new method developed in this paper is aiming at improving convergence properties, while reducing the memory and computation complexities in supervised training of neural networks. Extensive simulation results are provided to demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.",
keywords = "Gauss-Newton method, Jacobian rank deficiency, Neural-network training, Subset updating, Trust region algorithms",
author = "Guian Zhou and Jennie Si",
year = "1998",
doi = "10.1109/72.668886",
language = "English (US)",
volume = "9",
pages = "448--453",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "IEEE Computational Intelligence Society",
number = "3",

}

TY - JOUR

T1 - Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency

AU - Zhou, Guian

AU - Si, Jennie

PY - 1998

Y1 - 1998

N2 - In this paper we introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and it is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, as a modified Gauss-Newton, has been used successfully in solving nonlinear least squares problems including neural-network training. It outperforms (in terms of training accuracy, convergence properties, overall training time, etc.) the basic backpropagation and its variations with variable learning rate significantly, however, with higher computation and memory complexities within each iteration. The new method developed in this paper is aiming at improving convergence properties, while reducing the memory and computation complexities in supervised training of neural networks. Extensive simulation results are provided to demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.

AB - In this paper we introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and it is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, as a modified Gauss-Newton, has been used successfully in solving nonlinear least squares problems including neural-network training. It outperforms (in terms of training accuracy, convergence properties, overall training time, etc.) the basic backpropagation and its variations with variable learning rate significantly, however, with higher computation and memory complexities within each iteration. The new method developed in this paper is aiming at improving convergence properties, while reducing the memory and computation complexities in supervised training of neural networks. Extensive simulation results are provided to demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.

KW - Gauss-Newton method

KW - Jacobian rank deficiency

KW - Neural-network training

KW - Subset updating

KW - Trust region algorithms

UR - http://www.scopus.com/inward/record.url?scp=0032075495&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0032075495&partnerID=8YFLogxK

U2 - 10.1109/72.668886

DO - 10.1109/72.668886

M3 - Article

C2 - 18252468

AN - SCOPUS:0032075495

VL - 9

SP - 448

EP - 453

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 3

ER -