We propose a new method for Bayesian prediction that caters for models with a large number of parameters and is robust to model misspecification. Given a class of high-dimensional (but parametric) predictive models, this new approach constructs a posterior predictive using a variational approximation to a loss-based, or Gibbs, posterior that is directly focused on predictive accuracy. The theoretical behavior of the new prediction approach is analyzed and a form of optimality demonstrated. Applications to both simulated and empirical data using high-dimensional Bayesian neural network and autoregressive mixture models demonstrate that the approach provides more accurate results than various alternatives, including misspecified likelihood-based predictions.
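To make the core construction concrete, the sketch below shows a minimal, hypothetical instance of a variational approximation to a loss-based (Gibbs) posterior, pi(theta) ∝ exp(-loss_n(theta)) * prior(theta). All specifics here are illustrative assumptions, not the paper's construction: a one-parameter location model, a squared-error predictive loss, an N(0, 10^2) prior, a Gaussian variational family N(m, s^2), and reparameterised stochastic gradient ascent on the ELBO.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=200)  # simulated observations (illustrative)

# Gradient of log[exp(-loss_n(theta)) * prior(theta)], evaluated
# elementwise over a vector of sampled theta values:
#   loss_n(theta)  = 0.5 * sum_i (y_i - theta)^2   (squared-error loss)
#   log prior      = -0.5 * theta^2 / 100           (N(0, 10^2) prior)
def grad_log_gibbs(theta):
    return y.sum() - y.size * theta - theta / 100.0

# Fit q = N(m, s^2) by maximising the ELBO
#   E_q[-loss_n(theta) + log prior(theta)] + log s  (+ const)
# with the reparameterisation theta = m + s * eps, eps ~ N(0, 1).
m, log_s = 0.0, 0.0
lr = 1e-3
for step in range(2000):
    eps = rng.normal(size=32)            # Monte Carlo draws per step
    s = np.exp(log_s)
    theta = m + s * eps
    g = grad_log_gibbs(theta)            # pathwise (reparameterised) gradients
    m += lr * g.mean()
    log_s += lr * ((g * eps).mean() * s + 1.0)  # +1 from the entropy term log s
```

With this quadratic loss and Gaussian prior, the Gibbs posterior is itself Gaussian, so the variational optimum recovers it exactly: `m` converges to approximately the sample mean of `y` and `exp(log_s)` to approximately `1/sqrt(n)`. Replacing `grad_log_gibbs` with the gradient of a genuinely predictive score over a high-dimensional model is where the paper's setting departs from this toy sketch.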