Bayesian ensemble learning

Hugh A. Chipman, Edward I. George, Robert E. McCulloch

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

63 Scopus citations

Abstract

We develop a Bayesian "sum-of-trees" model, named BART, where each tree is constrained by a prior to be a weak learner. Fitting and inference are accomplished via an iterative backfitting MCMC algorithm. This model is motivated by ensemble methods in general, and boosting algorithms in particular. Like boosting, each weak learner (i.e., each weak tree) contributes a small amount to the overall model. However, our procedure is defined by a statistical model (a prior and a likelihood), whereas boosting is defined by an algorithm. This model-based approach enables a full and accurate assessment of uncertainty in model predictions, while remaining highly competitive in terms of predictive accuracy.
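The sum-of-trees structure described in the abstract can be illustrated with a deliberately simplified sketch: m weak trees (here, depth-1 stumps) are cyclically refit to the partial residuals left by the other m-1 trees, with a damped update so that each tree contributes only a small amount per sweep. This is an assumption-laden toy, not the paper's method: BART replaces these deterministic refits with MCMC draws of tree structures and leaf values from a posterior, and the `step` damping here merely stands in for the weak-learner prior. All function names and parameters below are invented for illustration.

```python
# Toy sketch of the sum-of-trees backfitting idea (NOT the BART MCMC:
# posterior sampling of trees and leaf values is omitted; a damped
# deterministic refit stands in for the weak-learner prior).

def fit_stump(xs, rs):
    """Least-squares depth-1 regression stump on 1-D inputs."""
    sx = sorted(xs)
    best = None
    for k in range(len(sx) - 1):
        split = (sx[k] + sx[k + 1]) / 2.0
        left = [r for x, r in zip(xs, rs) if x <= split]
        right = [r for x, r in zip(xs, rs) if x > split]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - ml) ** 2 for r in left)
               + sum((r - mr) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, split, ml, mr)
    _, split, ml, mr = best
    return lambda v: ml if v <= split else mr

def backfit_sum_of_trees(xs, ys, m=5, sweeps=20, step=0.3):
    """Fit y ~ sum_j g_j(x): each weak tree is cyclically refit to the
    partial residuals left by the other m-1 trees."""
    trees = [lambda v: 0.0 for _ in range(m)]
    for _ in range(sweeps):
        for j in range(m):
            # partial residuals: y minus the other trees' combined fit
            partial = [ys[i] - sum(trees[k](xs[i]) for k in range(m) if k != j)
                       for i in range(len(xs))]
            g = fit_stump(xs, partial)
            old = trees[j]
            # damped update keeps every tree a weak contributor per sweep
            trees[j] = (lambda old=old, g=g:
                        lambda v: (1 - step) * old(v) + step * g(v))()
    return lambda v: sum(t(v) for t in trees)

# toy data: a step function the ensemble should recover
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
f = backfit_sum_of_trees(xs, ys)
```

On this toy step function, no single tree fits the data outright; the damped backfitting sweeps let the five stumps share the signal, and their sum converges to the step. BART's contribution is to place a prior over such ensembles so the posterior also yields uncertainty intervals, which the deterministic sketch above cannot provide.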

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 19 - Proceedings of the 2006 Conference
Pages: 265-272
Number of pages: 8
State: Published - 2007
Externally published: Yes
Event: 20th Annual Conference on Neural Information Processing Systems, NIPS 2006 - Vancouver, BC, Canada
Duration: Dec 4 2006 - Dec 7 2006

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 20th Annual Conference on Neural Information Processing Systems, NIPS 2006
Country/Territory: Canada
City: Vancouver, BC
Period: 12/4/06 - 12/7/06

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
