Abstract
A large number of methods have been developed for estimating time-series regression parameters. Students and practitioners have a difficult time understanding what these various methods are, let alone picking the most appropriate one for their application. This article explains how these methods are related. A chronology for the development of the various methods is presented, followed by a logical characterization of the methods. An examination of current computational techniques and computing power leads to the conclusion that exact maximum likelihood estimators should be used in almost all cases where regression models have autoregressive, moving average, or mixed autoregressive-moving average error structures.
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 342-348 |
| Number of pages | 7 |
| Journal | American Statistician |
| Volume | 53 |
| Issue number | 4 |
| DOIs | |
| State | Published - Nov 1999 |
Keywords
- Approximate and exact estimators
- Autoregressive and moving average error models
- Cholesky decomposition
- Computational convenience
- Generalized least squares and maximum likelihood estimators
- Linear and nonlinear optimization methods
- Transformations to obtain uncorrelated errors
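The last keyword — transforming the model so the errors become uncorrelated — can be illustrated with a small sketch. The example below (not from the article; all names, the AR(1) coefficient, and the sample size are illustrative assumptions) simulates a regression with AR(1) errors and computes the GLS estimate by Cholesky-whitening the data, one of the approaches the abstract compares with exact maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
rho = 0.6                                   # assumed AR(1) error coefficient (illustrative)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])            # illustrative regression coefficients

# Simulate stationary AR(1) errors: e_t = rho * e_{t-1} + eps_t
eps = rng.normal(size=n)
e = np.zeros(n)
e[0] = eps[0] / np.sqrt(1 - rho**2)         # draw e_0 from the stationary distribution
for t in range(1, n):
    e[t] = rho * e[t - 1] + eps[t]
y = X @ beta_true + e

# Error covariance (up to sigma^2): Omega[i, j] = rho**|i-j| / (1 - rho^2)
idx = np.arange(n)
Omega = rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho**2)

# GLS via Cholesky decomposition: with Omega = L L', premultiplying by
# L^{-1} yields transformed data with uncorrelated errors, so OLS on the
# transformed data gives the GLS estimate.
L = np.linalg.cholesky(Omega)
y_star = np.linalg.solve(L, y)
X_star = np.linalg.solve(L, X)
beta_gls, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
print(beta_gls)
```

In practice rho is unknown; the approximate estimators the article surveys plug in an estimate of rho, while exact maximum likelihood maximizes the full Gaussian likelihood over the regression and error-process parameters jointly.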
ASJC Scopus subject areas
- Statistics and Probability
- General Mathematics
- Statistics, Probability and Uncertainty