Deep neural networks are prone to overconfident predictions on outliers. Bayesian neural networks and deep ensembles have both been shown to mitigate this problem to some extent. In this work, we aim to combine the benefits of the two approaches by proposing to predict with a Gaussian mixture model posterior that consists of a weighted sum of Laplace approximations of independently trained deep neural networks. The method can be used post hoc with any set of pre-trained networks and only requires a small computational and memory overhead compared to regular ensembles. We theoretically validate that our approach mitigates overconfidence "far away" from the training data and empirically compare against state-of-the-art baselines on standard uncertainty quantification benchmarks.
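To make the idea concrete, here is a minimal sketch of predicting with such a mixture, not the paper's actual implementation: it assumes a diagonal empirical-Fisher Laplace approximation for each independently trained ensemble member, uniform mixture weights instead of the learned weighting, and Monte Carlo estimation of the predictive. The function names `fit_diag_laplace` and `mixture_predict` are hypothetical.

```python
import torch
import torch.nn.functional as F

def fit_diag_laplace(model, train_loader, prior_prec=1.0):
    """Post-hoc diagonal Laplace approximation around the trained weights.

    Each weight's posterior precision is approximated by the diagonal
    empirical Fisher plus an isotropic Gaussian prior precision.
    """
    params = [p for p in model.parameters() if p.requires_grad]
    precision = [torch.full_like(p, prior_prec) for p in params]
    model.eval()
    for x, y in train_loader:
        for xi, yi in zip(x, y):  # per-example gradients for the Fisher
            loss = F.cross_entropy(model(xi.unsqueeze(0)), yi.unsqueeze(0))
            grads = torch.autograd.grad(loss, params)
            for prec, g in zip(precision, grads):
                prec.add_(g ** 2)
    # Per-weight posterior standard deviations of the diagonal Gaussian.
    return [prec.rsqrt() for prec in precision]

@torch.no_grad()
def mixture_predict(models, stds, x, n_samples=20):
    """Predictive of a uniform mixture of per-member Laplace posteriors."""
    probs = []
    for model, std in zip(models, stds):
        params = [p for p in model.parameters() if p.requires_grad]
        mean = [p.clone() for p in params]  # MAP weights = component mean
        for _ in range(n_samples):
            for p, m, s in zip(params, mean, std):
                p.copy_(m + s * torch.randn_like(s))  # sample weights
            probs.append(F.softmax(model(x), dim=-1))
        for p, m in zip(params, mean):  # restore the MAP weights
            p.copy_(m)
    # Averaging over components and samples yields the mixture predictive.
    return torch.stack(probs).mean(dim=0)
```

Because fitting only requires gradients of the already-trained networks and storing one variance vector per member, the overhead relative to a plain ensemble is small, consistent with the post-hoc usage described above.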