### Abstract

This article develops a general method for assessing the influence of model assumptions in a Bayesian analysis. We assume that model choices are indexed by a hyperparameter with some given initial choice. We use the term "model" to encompass both the sampling model and the prior distribution. We wish to assess the effect of changing the hyperparameter away from the initial choice. We are performing a sensitivity analysis, with the hyperparameter defining our perturbations. We use the Kullback–Leibler divergence to measure the difference between posteriors corresponding to different choices of the hyperparameter. We also measure the change in priors. If small changes in the priors lead to large changes in posteriors, the choice of hyperparameter is influential. The second-order difference in the Kullback–Leibler divergence is expressed in terms of Fisher information matrices. The relative change in posteriors compared with priors may be summarized by the relative eigenvalue of the posterior and prior Fisher information matrices. The corresponding eigenvector indicates which aspects of the perturbation hyperparameter are most influential. Examples considered are the choice of conjugate prior in regression, case weights in regression, and the choice of Dirichlet prior for multinomials.
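The relative-eigenvalue diagnostic described in the abstract can be sketched numerically. The following is an illustrative reconstruction, not the paper's code: the two Fisher information matrices are made-up placeholders, and the generalized eigenproblem `G_post v = λ G_prior v` is solved by whitening with a Cholesky factor of the prior information.

```python
import numpy as np

# Hypothetical Fisher information matrices of the posterior and the prior
# with respect to the perturbation hyperparameter (values are illustrative).
G_post = np.array([[4.0, 1.0], [1.0, 3.0]])   # posterior Fisher information
G_prior = np.array([[2.0, 0.5], [0.5, 1.0]])  # prior Fisher information

# Solve the generalized eigenproblem  G_post v = lambda * G_prior v
# by whitening: with L = chol(G_prior), reduce to the ordinary symmetric
# eigenproblem of  L^{-1} G_post L^{-T}.
L = np.linalg.cholesky(G_prior)
Linv = np.linalg.inv(L)
M = Linv @ G_post @ Linv.T
eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order

# Largest relative eigenvalue: how much more the posterior moves than the
# prior under the worst-case perturbation direction.
lam_max = eigvals[-1]
# Map the corresponding eigenvector back to the hyperparameter scale.
v_max = Linv.T @ eigvecs[:, -1]

print(lam_max)  # relative influence of the most sensitive direction
print(v_max)    # that direction in the hyperparameter coordinates
```

A large `lam_max` flags an influential hyperparameter (small prior change, large posterior change), and `v_max` points to the components of the perturbation that drive it.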

| Original language | English (US) |
|---|---|
| Pages (from-to) | 473-478 |
| Number of pages | 6 |
| Journal | Journal of the American Statistical Association |
| Volume | 84 |
| Issue number | 406 |
| ISSN | 0162-1459 |
| DOIs | https://doi.org/10.1080/01621459.1989.10478793 |
| State | Published - 1989 |
| Externally published | Yes |

### Keywords

- Diagnostics
- Fisher information
- Kullback–Leibler divergence
- Posterior distribution
- Predictive distribution

### ASJC Scopus subject areas

- Statistics and Probability
- Statistics, Probability and Uncertainty

### Cite this

McCulloch, Robert. **Local model influence.** *Journal of the American Statistical Association*, vol. 84, no. 406, 1989, pp. 473-478. https://doi.org/10.1080/01621459.1989.10478793

Research output: Contribution to journal › Article

