We introduce ensembles of stochastic neural networks to approximate the Bayesian posterior, combining stochastic methods such as dropout with deep ensembles. The stochastic ensembles are formulated as families of distributions and trained to approximate the Bayesian posterior with variational inference. We implement stochastic ensembles based on Monte Carlo dropout, DropConnect and a novel non-parametric version of dropout and evaluate them on a toy problem and CIFAR image classification. For both tasks, we test the quality of the posteriors directly against Hamiltonian Monte Carlo simulations. Our results show that stochastic ensembles provide more accurate posterior estimates than other popular baselines for Bayesian inference.
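The core construction the abstract describes — an ensemble of stochastic (dropout) networks whose predictions are averaged over both ensemble members and dropout samples to approximate the posterior predictive — can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the members here are small untrained random MLPs, and the dropout rate, ensemble size, and sample count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_forward(x, W1, W2, p=0.5):
    """One stochastic forward pass: dropout is kept active at test time
    (Monte Carlo dropout), so each call draws a different subnetwork."""
    h = np.maximum(x @ W1, 0.0)                 # ReLU hidden layer
    mask = rng.random(h.shape) > p              # Bernoulli dropout mask
    h = h * mask / (1.0 - p)                    # inverted-dropout scaling
    logits = h @ W2
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)    # softmax class probabilities

# A "stochastic ensemble": K independently initialized members, each itself
# a stochastic network. The posterior predictive is approximated by
# averaging over both members and dropout samples per member.
# (K, S, and the layer sizes below are illustrative choices.)
K, S, d_in, d_h, d_out = 5, 20, 4, 32, 3
members = [(rng.normal(0.0, 0.3, (d_in, d_h)),
            rng.normal(0.0, 0.3, (d_h, d_out))) for _ in range(K)]

x = rng.normal(size=(1, d_in))
samples = np.stack([mc_dropout_forward(x, W1, W2)
                    for W1, W2 in members
                    for _ in range(S)])
predictive = samples.mean(axis=0)   # approximate posterior predictive
```

In the paper's setting the members would be trained with a variational objective rather than left at random initialization; the averaging step, however, is the same: disagreement across members and across dropout masks is what carries the posterior uncertainty.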