This paper develops a novel stochastic tree ensemble method for nonlinear regression, referred to as Accelerated Bayesian Additive Regression Trees, or XBART. By combining regularization and stochastic search strategies from Bayesian modeling with computationally efficient techniques from recursive partitioning algorithms, XBART attains state-of-the-art performance at prediction and function estimation. Simulation studies demonstrate that XBART provides accurate point-wise estimates of the mean function and does so faster than popular alternatives, such as BART, XGBoost, and neural networks (using Keras) on a variety of test functions. Additionally, it is demonstrated that using XBART to initialize the standard BART MCMC algorithm considerably improves credible interval coverage and reduces total run-time. Finally, two basic theoretical results are established: the single tree version of the model is asymptotically consistent and the Markov chain produced by the ensemble version of the algorithm has a unique stationary distribution.