TY - JOUR
T1 - Local model influence
AU - McCulloch, Robert E.
N1 - Funding Information:
• Robert E. McCulloch is Assistant Professor of Statistics, Graduate School of Business, University of Chicago, Chicago IL 60637.This work was partially funded by National Institutes of Health Grant NIGMS 25271 and the University of Chicago Graduate School of Business. The author thanks Seymour Geisser, Dennis Cook, Jim Hodges, and Arnold Zellner, as well as the referees, for many helpful comments.
PY - 1989/6
Y1 - 1989/6
N2 - This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback–Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback–Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.
AB - This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term “model” to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback–Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback–Leibler divergence is expressed by Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.
KW - Diagnostics
KW - Fisher information
KW - Kullback–Leibler divergence
KW - Posterior distribution
KW - Predictive distribution
UR - http://www.scopus.com/inward/record.url?scp=25844453976&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=25844453976&partnerID=8YFLogxK
U2 - 10.1080/01621459.1989.10478793
DO - 10.1080/01621459.1989.10478793
M3 - Article
AN - SCOPUS:25844453976
SN - 0162-1459
VL - 84
SP - 473
EP - 478
JO - Journal of the American Statistical Association
JF - Journal of the American Statistical Association
IS - 406
ER -