We present an efficient semiparametric variational method for approximating the Gibbs posterior distribution of Bayesian regression models that predict the data through a linear combination of the available covariates. Notable special cases include generalized linear mixed models, support vector machines, and quantile and expectile regression. The variational optimization algorithm we propose involves only the calculation of univariate numerical integrals when no analytic solutions are available. Neither differentiability, nor conjugacy, nor elaborate data-augmentation strategies are required. We discuss several generalizations of the proposed approach that account for additive models, shrinkage priors, and dynamic and spatial models, providing a unifying framework for statistical learning that covers a wide range of applications. The properties of our semiparametric variational approximation are then assessed through a theoretical analysis and an extensive simulation study, in which we compare our proposal with Markov chain Monte Carlo, conjugate mean-field variational Bayes, and the Laplace approximation in terms of signal reconstruction, posterior approximation accuracy, and execution time. A real-data example is then presented through a probabilistic load forecasting application on US power load consumption data.
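To illustrate the kind of univariate numerical integral the abstract refers to, the following is a minimal sketch, not the paper's implementation: it assumes a Gaussian variational factor and approximates the expectation of a (possibly non-differentiable) loss, such as the pinball loss of quantile regression, by Gauss-Hermite quadrature. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def gauss_hermite_expectation(f, mu, sigma, n_points=20):
    """Approximate E[f(X)] for X ~ N(mu, sigma^2) by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    # Change of variables x = mu + sqrt(2)*sigma*z maps the Hermite weight
    # exp(-z^2) onto the Gaussian density.
    x = mu + np.sqrt(2.0) * sigma * nodes
    return np.sum(weights * f(x)) / np.sqrt(np.pi)

# Sanity check on a polynomial, where the quadrature is exact:
# E[X^2] = mu^2 + sigma^2 = 1 + 4 = 5 for X ~ N(1, 2^2).
second_moment = gauss_hermite_expectation(lambda x: x**2, mu=1.0, sigma=2.0)

# Expected pinball (quantile) loss at level tau -- a non-differentiable
# loss that such a scheme can still handle without data augmentation.
tau = 0.9
pinball = lambda x: np.maximum(tau * x, (tau - 1.0) * x)
expected_loss = gauss_hermite_expectation(pinball, mu=0.5, sigma=1.3)
```

Only a one-dimensional quadrature rule is needed per expectation, which is consistent with the abstract's claim that no multivariate integration, differentiability, or conjugacy is required.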