Parallel Bayesian Additive Regression Trees

Matthew T. Pratola, Hugh A. Chipman, James R. Gattiker, David M. Higdon, Robert McCulloch, William N. Rust

Research output: Contribution to journal › Article › peer-review

39 Scopus citations

Abstract

Bayesian additive regression trees (BART) is a Bayesian approach to flexible nonlinear regression which has been shown to be competitive with the best modern predictive methods such as those based on bagging and boosting. BART offers some advantages. For example, the stochastic search Markov chain Monte Carlo (MCMC) algorithm can provide a more complete search of the model space and variation across MCMC draws can capture the level of uncertainty in the usual Bayesian way. The BART prior is robust in that reasonable results are typically obtained with a default prior specification. However, the publicly available implementation of the BART algorithm in the R package BayesTree is not fast enough to be considered interactive with over a thousand observations, and is unlikely to even run with 50,000 to 100,000 observations. In this article we show how the BART algorithm may be modified and then computed using single program, multiple data (SPMD) parallel computation implemented using the Message Passing Interface (MPI) library. The approach scales nearly linearly in the number of processor cores, enabling the practitioner to perform statistical inference on massive datasets. Our approach can also handle datasets too massive to fit on any single data repository.
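The key idea sketched in the abstract is that each processor core keeps only its own shard of the data, and each MCMC step only needs a few sufficient statistics combined across cores rather than the raw observations. Below is a minimal C++/MPI sketch of that single program, multiple data (SPMD) pattern, not the authors' implementation; the shard size, the synthetic residuals, and the count-and-sum statistic are illustrative assumptions.

```cpp
// Minimal SPMD sketch (assumed, not the paper's code): each MPI rank owns a
// shard of the residuals, computes local sufficient statistics for one tree
// node, and an all-reduce combines them so every core can evaluate the same
// Metropolis-Hastings acceptance ratio without ever shipping the raw data.
#include <mpi.h>
#include <vector>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);

    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    // Each rank holds only its local shard (synthetic residuals for illustration).
    const int n_local = 1000;  // observations per core: an assumed toy value
    std::vector<double> residual(n_local);
    std::srand(42 + rank);
    for (int i = 0; i < n_local; ++i)
        residual[i] = std::rand() / static_cast<double>(RAND_MAX) - 0.5;

    // Local sufficient statistics for a terminal node: count and sum of residuals.
    double local_stats[2] = { static_cast<double>(n_local), 0.0 };
    for (int i = 0; i < n_local; ++i) local_stats[1] += residual[i];

    // Combine across cores: only these few numbers cross the network,
    // which is why the approach can scale to very large n.
    double global_stats[2] = { 0.0, 0.0 };
    MPI_Allreduce(local_stats, global_stats, 2, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        std::printf("n = %.0f, mean residual = %f (combined from %d ranks)\n",
                    global_stats[0], global_stats[1] / global_stats[0], size);

    MPI_Finalize();
    return 0;
}
```

Compiled with mpicxx and launched with mpirun, each rank works only on its local shard, so the per-iteration cost stays roughly flat as cores are added; this is the near-linear scaling in the number of processor cores that the abstract refers to, and it also explains why data too large for any single repository can still be analyzed.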

Original language: English (US)
Pages (from-to): 830-852
Number of pages: 23
Journal: Journal of Computational and Graphical Statistics
Volume: 23
Issue number: 3
DOIs
State: Published - Jul 3 2014
Externally published: Yes

Keywords

  • Big Data
  • Markov chain Monte Carlo
  • Nonlinear
  • Scalable
  • Statistical computing

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Discrete Mathematics and Combinatorics
