B0501
Title: Bayesian tree ensembles that adapt to smoothness and sparsity
Authors: Antonio Linero - Florida State University (United States) [presenting]
Yun Yang - Florida State University (United States)
Abstract: Ensembles of decision trees are a useful tool for obtaining flexible estimates of regression functions. Examples of these methods include gradient boosted decision trees, random forests, and Bayesian CART. A potential shortcoming of tree ensembles is their lack of smoothness; for example, to approximate a linear function, a single decision tree requires a large number of branches. We show that this problem can be mitigated by instead considering decision trees in which the decisions are treated as probabilistic. We implement this in the context of the Bayesian additive regression trees (BART) framework, and show that this approach gives substantial performance improvements on commonly used benchmark problems. We provide theoretical support for our methodology by showing that the posterior concentrates at the minimax rate, up to a logarithmic factor, for $\alpha$-H\"older functions, adaptively over $\alpha$. Our priors over tree structures also allow for adaptation to sparsity; combined with our smoothing of the decision trees, this allows the posterior to concentrate at near-optimal rates adaptively over many classes of functions, including $\alpha$-H\"older sparse functions and additive functions.
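To illustrate what treating the decisions as probabilistic means, the sketch below shows a toy soft decision tree in which each split routes an observation left or right with a probability given by a logistic gate, so the prediction varies smoothly in the covariates and a hard split is recovered as the bandwidth shrinks to zero. This is a minimal illustration, not the authors' implementation; the logistic gating function, the class names, and the bandwidth parameterization are assumptions made for exposition.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Leaf:
    """Terminal node holding a scalar prediction (the leaf parameter)."""
    def __init__(self, value):
        self.value = value

    def predict(self, x):
        return self.value

class SoftSplitNode:
    """Internal node: routes an observation left with a probability given by
    a smooth logistic gate instead of a hard threshold (assumed form)."""
    def __init__(self, feature, cutpoint, bandwidth, left, right):
        self.feature = feature      # index of the splitting covariate
        self.cutpoint = cutpoint    # split location c
        self.bandwidth = bandwidth  # tau > 0; tau -> 0 recovers a hard split
        self.left = left
        self.right = right

    def predict(self, x):
        # A hard tree would check x[feature] <= cutpoint; here the decision
        # is probabilistic, and the prediction averages over both branches.
        p_left = sigmoid((self.cutpoint - x[self.feature]) / self.bandwidth)
        return p_left * self.left.predict(x) + (1.0 - p_left) * self.right.predict(x)

# A two-leaf soft tree: its prediction interpolates smoothly between the two
# leaf values around x[0] = 0.5, so smooth (e.g. linear) trends can be
# approximated with far fewer branches than a hard tree would need.
tree = SoftSplitNode(feature=0, cutpoint=0.5, bandwidth=0.1,
                     left=Leaf(-1.0), right=Leaf(1.0))
print(tree.predict(np.array([0.45])), tree.predict(np.array([0.55])))
```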