BART: Bayesian additive regression trees

Hugh A. Chipman, Edward I. George, Robert E. McCulloch

Research output: Contribution to journal › Article › peer-review

752 Scopus citations

Abstract

We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior. Effectively, BART is a nonparametric Bayesian regression approach that uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference, including point and interval estimates of the unknown regression function as well as the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART's many features are illustrated in a bake-off against competing methods on 42 different data sets, in a simulation experiment, and on a drug discovery classification problem.
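To make the backfitting idea in the abstract concrete, below is a minimal Python sketch of a BART-style Bayesian backfitting sweep. It is a drastically simplified illustration, not the paper's algorithm: each "tree" is frozen as a random single-split stump, only the leaf means are Gibbs-sampled against partial residuals, and the noise level is held fixed, whereas actual BART also proposes structural tree moves via Metropolis-Hastings and samples the error variance from its posterior. All names (e.g. `stump_fit`) and the hyperparameter values are illustrative assumptions.

```python
# Toy sketch of BART-style Bayesian backfitting (not the paper's full algorithm):
# m fixed random stumps, leaf means updated by conjugate normal draws given
# the partial residuals that exclude the current tree's fit.
import numpy as np

rng = np.random.default_rng(0)

n, p, m = 200, 5, 50                 # observations, predictors, number of trees
X = rng.uniform(size=(n, p))
y = np.sin(4 * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

sigma = 0.1                          # noise sd, held fixed for simplicity
sigma_mu = 0.5 / np.sqrt(m)          # shrinkage prior sd on leaf means:
                                     # scales with m so each tree stays a weak learner

# Each stump: a split variable, a split point, and two leaf means.
split_var = rng.integers(0, p, size=m)
split_cut = rng.uniform(size=m)
leaves = np.zeros((m, 2))

def stump_fit(j):
    """Fitted values of stump j on the training inputs."""
    left = X[:, split_var[j]] <= split_cut[j]
    return np.where(left, leaves[j, 0], leaves[j, 1])

fits = np.zeros((m, n))
for sweep in range(500):             # backfitting MCMC sweeps
    for j in range(m):
        # Partial residual: subtract every tree's fit except tree j's.
        resid = y - fits.sum(axis=0) + fits[j]
        left = X[:, split_var[j]] <= split_cut[j]
        for side, mask in enumerate((left, ~left)):
            # Conjugate normal draw for the leaf mean given the residuals.
            nk = mask.sum()
            prec = nk / sigma**2 + 1.0 / sigma_mu**2
            mean = (resid[mask].sum() / sigma**2) / prec
            leaves[j, side] = rng.normal(mean, 1.0 / np.sqrt(prec))
        fits[j] = stump_fit(j)

print("train RMSE:", np.sqrt(np.mean((y - fits.sum(axis=0)) ** 2)))
```

The shrinkage prior on the leaf means is what keeps each component a weak learner, so the fit is spread across many small trees rather than concentrated in any one of them; in the full algorithm, averaging the sum-of-trees fit over post-burn-in sweeps yields the posterior point and interval estimates described above.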

Original language: English (US)
Pages (from-to): 266-298
Number of pages: 33
Journal: Annals of Applied Statistics
Volume: 6
Issue number: 1
DOI: 10.1214/11-AOAS514
State: Published - Mar 2012
Externally published: Yes

Keywords

  • Bayesian backfitting
  • Boosting
  • CART
  • Classification
  • Ensemble
  • MCMC
  • Nonparametric regression
  • Probit model
  • Random basis
  • Regularization
  • Sum-of-trees model
  • Variable selection
  • Weak learner

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Statistics, Probability and Uncertainty
