A recent trend in Bayesian research has been revisiting generalizations of the likelihood that enable Bayesian inference without requiring the specification of a model for the data-generating mechanism. This paper focuses on a Bayesian nonparametric extension of Wedderburn's quasi-likelihood, using Bayesian additive regression trees to model the mean function. Here, the analyst posits only a structural relationship between the mean and variance of the outcome. We show that this approach provides a unified, computationally efficient framework for extending Bayesian decision tree ensembles to many new settings, including simplex-valued and heavily heteroskedastic data. We also introduce Bayesian strategies for inferring the dispersion parameter of the quasi-likelihood, a task complicated by the fact that the quasi-likelihood itself carries no information about this parameter. Despite this challenge, we are able to inject updates for the dispersion parameter into a Markov chain Monte Carlo inference scheme in a way that, in the parametric setting, leads to a Bernstein-von Mises result for the stationary distribution of the resulting Markov chain. We illustrate the utility of our approach on a variety of synthetic and real datasets.
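As background only (the abstract does not display the formula, and the paper's exact parameterization may differ), Wedderburn's quasi-likelihood for a single observation y with mean \mu, variance function V, and dispersion \phi is conventionally written as

\[
  Q(\mu; y) \;=\; \int_{y}^{\mu} \frac{y - t}{\phi\, V(t)} \, dt,
  \qquad
  \frac{\partial Q(\mu; y)}{\partial \mu} \;=\; \frac{y - \mu}{\phi\, V(\mu)},
  \qquad
  \operatorname{Var}(y) \;=\; \phi\, V(\mu),
\]

where the last identity is the structural mean-variance relationship the analyst posits in place of a full likelihood; note that \phi enters only as a scale factor of the quasi-score, which is why the quasi-likelihood alone is uninformative about the dispersion parameter.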