Understanding time-series regression estimators

Askar H. Choudhury, Robert Hubata, Robert St Louis

Research output: Contribution to journal › Article

19 Citations (Scopus)

Abstract

A large number of methods have been developed for estimating time-series regression parameters. Students and practitioners have a difficult time understanding what these various methods are, let alone picking the most appropriate one for their application. This article explains how these methods are related. A chronology for the development of the various methods is presented, followed by a logical characterization of the methods. An examination of current computational techniques and computing power leads to the conclusion that exact maximum likelihood estimators should be used in almost all cases where regression models have autoregressive, moving average, or mixed autoregressive-moving average error structures.
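The estimators the abstract compares can be illustrated with a minimal sketch (not from the article itself) of one of the keyword techniques: generalized least squares for a regression with AR(1) errors, computed by Cholesky-whitening the data. The true parameters, the simulated data, and the assumption that the autoregressive coefficient `rho` is known are all illustrative; exact maximum likelihood, which the article recommends, would estimate `rho` jointly with the regression coefficients.

```python
import numpy as np

# Illustrative sketch: GLS for y = b0 + b1*x + u with AR(1) errors u,
# via the Cholesky decomposition of the error covariance. rho is
# assumed known here; exact ML would estimate it jointly with beta.
rng = np.random.default_rng(0)
n, rho = 200, 0.7

# Simulate y = 2 + 3*x + u, where u follows a stationary AR(1) process.
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
u[0] = eps[0] / np.sqrt(1 - rho**2)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = 2 + 3 * x + u
X = np.column_stack([np.ones(n), x])

# AR(1) error covariance up to scale: Omega[i, j] = rho**|i - j|.
# (GLS is invariant to the overall scale of Omega.)
Omega = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# GLS via Cholesky: with Omega = L L', premultiplying by L^{-1} yields
# transformed data with spherical errors, so OLS on them equals GLS.
L = np.linalg.cholesky(Omega)
y_star = np.linalg.solve(L, y)
X_star = np.linalg.solve(L, X)
beta_gls, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
print(beta_gls)  # estimates close to the true values (2, 3)
```

The same whitening idea underlies the approximate estimators the article surveys; they differ mainly in how the error-covariance parameters are obtained and in how the first observations are treated.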

Original language: English (US)
Pages (from-to): 342-348
Number of pages: 7
Journal: American Statistician
Volume: 53
Issue number: 4
State: Published - Nov 1999

Keywords

  • Approximate and exact estimators
  • Autoregressive and moving average error models
  • Cholesky decomposition
  • Computational convenience
  • Generalized least squares and maximum likelihood estimators
  • Linear and nonlinear optimization methods

ASJC Scopus subject areas

  • Mathematics (all)
  • Statistics and Probability

Cite this

Choudhury, A. H., Hubata, R., & St Louis, R. (1999). Understanding time-series regression estimators. American Statistician, 53(4), 342-348.

@article{34aca1532af1485597af17101e1462c4,
title = "Understanding time-series regression estimators",
abstract = "A large number of methods have been developed for estimating time-series regression parameters. Students and practitioners have a difficult time understanding what these various methods are, let alone picking the most appropriate one for their application. This article explains how these methods are related. A chronology for the development of the various methods is presented, followed by a logical characterization of the methods. An examination of current computational techniques and computing power leads to the conclusion that exact maximum likelihood estimators should be used in almost all cases where regression models have autoregressive, moving average, or mixed autoregressive-moving average error structures.",
keywords = "Approximate and exact estimators, Autoregressive and moving average error models, Cholesky decomposition, Computational convenience, Generalized least squares and maximum likelihood estimators, Linear and nonlinear optimization methods",
author = "Choudhury, {Askar H.} and Robert Hubata and {St Louis}, Robert",
year = "1999",
month = "11",
language = "English (US)",
volume = "53",
pages = "342--348",
journal = "American Statistician",
issn = "0003-1305",
publisher = "American Statistical Association",
number = "4",
}
