Particle Gibbs for Bayesian additive regression trees


Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Authors Balaji Lakshminarayanan, Daniel M. Roy, Yee Whye Teh
Journal/Conference Name AISTATS
Paper Category
Paper Abstract Additive regression trees are flexible nonparametric models and popular off-the-shelf tools for real-world non-linear regression. In application domains, such as bioinformatics, where there is also demand for probabilistic predictions with measures of uncertainty, the Bayesian additive regression trees (BART) model, introduced by Chipman et al. (2010), is increasingly popular. As data sets have grown in size, however, the standard Metropolis–Hastings algorithms used to perform inference in BART are proving inadequate. In particular, these Markov chains make local changes to the trees and suffer from slow mixing when the data are high-dimensional or the best-fitting trees are more than a few layers deep. We present a novel sampler for BART based on the Particle Gibbs (PG) algorithm (Andrieu et al., 2010) and a top-down particle filtering algorithm for Bayesian decision trees (Lakshminarayanan et al., 2013). Rather than making local changes to individual trees, the PG sampler proposes a complete tree to fit the residual. Experiments show that the PG sampler outperforms existing samplers in many settings.
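
Since the page lists R as the code language, here is a minimal R sketch (not the authors' implementation) of the sweep structure the abstract describes: rather than perturbing one tree locally, each Gibbs step replaces a whole tree with a new fit to the residual left by the other trees. The helper `sample_tree_pg` is hypothetical; the real PG sampler draws a tree from its conditional posterior via top-down particle filtering, whereas this stand-in deterministically fits a single-split stump so the sketch runs end to end.

```r
## Sketch of BART-style backfitting over m trees. NOT the paper's
## algorithm: sample_tree_pg below is a hypothetical stand-in that
## fits one deterministic stump, where the actual PG sampler would
## draw a complete tree from its conditional posterior.

sample_tree_pg <- function(x, resid) {
  # Placeholder for the top-down particle-filter tree proposal:
  # split at the median of x and return the leaf means on each side.
  split <- median(x)
  list(split = split,
       left  = mean(resid[x <= split]),
       right = mean(resid[x >  split]))
}

predict_tree <- function(tree, x) {
  ifelse(x <= tree$split, tree$left, tree$right)
}

set.seed(1)
n <- 200; m <- 10                         # observations, number of trees
x <- runif(n)
y <- sin(2 * pi * x) + rnorm(n, sd = 0.2)

# Initialize all trees as trivial stumps predicting zero.
trees <- replicate(m, list(split = 0.5, left = 0, right = 0),
                   simplify = FALSE)
fits  <- sapply(trees, predict_tree, x = x)   # n x m matrix of tree fits

for (sweep in 1:50) {
  for (j in 1:m) {
    # Residual with tree j held out: what the proposed tree must fit.
    resid <- y - rowSums(fits[, -j, drop = FALSE])
    trees[[j]] <- sample_tree_pg(x, resid)      # propose a complete tree
    fits[, j]  <- predict_tree(trees[[j]], x)
  }
}

mean((y - rowSums(fits))^2)   # in-sample MSE of the additive fit
```

The structural point is the held-out residual `fits[, -j]`: the proposal sees the data with tree j removed, which is what allows the PG sampler to propose a complete replacement tree instead of a local edit.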
Date of publication 2015
Code Programming Language R
Comment

Copyright Researcher 2021