Abstract
In this paper we propose and analyze nonlinear least squares methods that process the data incrementally, one data block at a time. Such methods are well suited to large data sets and real-time operation, and have received much attention in the context of neural network training problems. We focus on the Extended Kalman Filter, which may be viewed as an incremental version of the Gauss-Newton method. We provide a nonstochastic analysis of its convergence properties, and we discuss variants aimed at accelerating its convergence.
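To illustrate the idea of the Extended Kalman Filter as an incremental Gauss-Newton method, here is a minimal sketch for a scalar nonlinear least squares problem. The model `y = exp(theta * x)`, the prior scale `p0`, and the noise parameter `r` are illustrative assumptions, not the paper's exact formulation; each data sample is processed one at a time, as in the incremental setting the abstract describes.

```python
import math

def ekf_nls(data, theta0, p0=1.0, r=1e-4):
    """Scalar EKF / incremental Gauss-Newton step for fitting y = exp(theta*x).

    Hypothetical example model; processes one (x, y) sample per update.
    """
    theta, p = theta0, p0
    for x, y in data:
        h = math.exp(theta * x)   # model prediction at current estimate
        H = x * h                 # Jacobian dh/dtheta
        s = H * p * H + r         # innovation variance
        k = p * H / s             # Kalman gain
        theta += k * (y - h)      # incremental parameter update
        p -= k * H * p            # covariance update
    return theta

# usage: noiseless samples generated from theta_true = 0.5
theta_true = 0.5
data = [(x / 10, math.exp(theta_true * x / 10)) for x in range(1, 21)]
est = ekf_nls(data, theta0=0.0)
```

With noiseless data, a single incremental pass already drives the estimate close to the true parameter; this is the data-block-at-a-time behavior that makes such methods attractive for large data sets.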
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1211-1214 |
| Number of pages | 4 |
| Journal | Proceedings of the IEEE Conference on Decision and Control |
| Volume | 2 |
| State | Published - 1994 |
| Externally published | Yes |
| Event | Proceedings of the 33rd IEEE Conference on Decision and Control. Part 1 (of 4), Lake Buena Vista, FL, USA. Duration: Dec 14 1994 → Dec 16 1994 |
ASJC Scopus subject areas
- Control and Systems Engineering
- Modeling and Simulation
- Control and Optimization