Local model influence

Research output: Contribution to journal › Article

112 Citations (Scopus)

Abstract

This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback-Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback-Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.
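The diagnostic described in the abstract can be sketched numerically for the Dirichlet-multinomial example. The Fisher information of a Dirichlet(α) family with respect to α has the closed form I_ij(α) = ψ′(α_i)δ_ij − ψ′(Σ_k α_k), where ψ′ is the trigamma function, and the "relative eigenvalue" is then the largest generalized eigenvalue of the posterior information matrix against the prior information matrix. The sketch below is illustrative only: the initial hyperparameter and the multinomial counts are made-up numbers, not values from the paper.

```python
# Sketch: relative-eigenvalue influence diagnostic for a Dirichlet prior
# on multinomial probabilities. The model is indexed by the Dirichlet
# hyperparameter alpha; the prior family is Dir(alpha) and the posterior
# family is Dir(alpha + counts).
import numpy as np
from scipy.special import polygamma
from scipy.linalg import eigh

def dirichlet_fisher(alpha):
    """Fisher information of Dir(alpha) with respect to alpha:
    I_ij = trigamma(alpha_i) * delta_ij - trigamma(sum(alpha))."""
    alpha = np.asarray(alpha, dtype=float)
    return np.diag(polygamma(1, alpha)) - polygamma(1, alpha.sum())

alpha0 = np.array([1.0, 1.0, 1.0])   # initial hyperparameter choice (made up)
counts = np.array([10.0, 3.0, 7.0])  # observed multinomial counts (made up)

J_prior = dirichlet_fisher(alpha0)           # second-order KL change in priors
J_post = dirichlet_fisher(alpha0 + counts)   # second-order KL change in posteriors

# Generalized eigenproblem J_post v = lambda * J_prior v: the largest
# eigenvalue summarizes how much the posterior moves per unit of prior
# perturbation, and its eigenvector is the most influential direction
# in hyperparameter space.
lam, vecs = eigh(J_post, J_prior)
print("relative eigenvalues:", lam)
print("most influential direction:", vecs[:, np.argmax(lam)])
```

With the data swamping a weak prior, as here, the relative eigenvalues come out well below one, indicating that small perturbations of the Dirichlet hyperparameter are not influential for the posterior.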

Original language: English (US)
Pages (from-to): 473-478
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 84
Issue number: 406
DOIs: 10.1080/01621459.1989.10478793
State: Published - 1989
Externally published: Yes

Keywords

  • Diagnostics
  • Fisher information
  • Kullback-Leibler divergence
  • Posterior distribution
  • Predictive distribution

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Local model influence. / McCulloch, Robert.

In: Journal of the American Statistical Association, Vol. 84, No. 406, 1989, p. 473-478.


@article{0366e520452f4ede923312712fa947ff,
title = "Local model influence",
abstract = "This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback-Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback-Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.",
keywords = "Diagnostics, Fisher information, Kullback-Leibler divergence, Posterior distribution, Predictive distribution",
author = "Robert McCulloch",
year = "1989",
doi = "10.1080/01621459.1989.10478793",
language = "English (US)",
volume = "84",
pages = "473--478",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "406",

}

TY - JOUR

T1 - Local model influence

AU - McCulloch, Robert

PY - 1989

Y1 - 1989

N2 - This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback-Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback-Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.

AB - This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback-Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback-Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.

KW - Diagnostics

KW - Fisher information

KW - Kullback-Leibler divergence

KW - Posterior distribution

KW - Predictive distribution

UR - http://www.scopus.com/inward/record.url?scp=25844453976&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=25844453976&partnerID=8YFLogxK

U2 - 10.1080/01621459.1989.10478793

DO - 10.1080/01621459.1989.10478793

M3 - Article

VL - 84

SP - 473

EP - 478

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 406

ER -