As we move away from the data, the predictive uncertainty should increase, since a great variety of explanations are consistent with the little available information. We introduce Distance-Aware Prior (DAP) calibration, a method to correct overconfidence of Bayesian deep learning models outside of the training domain. We define DAPs as prior distributions over the model parameters that depend on the inputs through a measure of their distance from the training set. DAP calibration is agnostic to the posterior inference method, and it can be performed as a post-processing step. We demonstrate its effectiveness against several baselines in a variety of classification and regression problems, including benchmarks designed to test the quality of predictive distributions away from the data.
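As a loose illustration of the distance-aware idea only (not the authors' implementation), the sketch below inflates a model's predictive variance using the Euclidean distance from each test input to its nearest training point. The functions `nearest_train_distance` and `dap_calibrated_variance`, and the parameters `sigma0` and `alpha`, are hypothetical names for this sketch; the paper's DAPs act on the prior over model parameters, whereas this toy applies the distance scaling directly to predictive variances as a post-processing step.

```python
# Minimal sketch of the distance-aware intuition, under the assumptions above.
import numpy as np

def nearest_train_distance(x_test, x_train):
    """Euclidean distance from each test point to its nearest training point."""
    diffs = x_test[:, None, :] - x_train[None, :, :]   # (n_test, n_train, d)
    return np.linalg.norm(diffs, axis=-1).min(axis=1)  # (n_test,)

def dap_calibrated_variance(pred_var, x_test, x_train, sigma0=1.0, alpha=1.0):
    """Inflate predictive variance with distance from the training set.

    pred_var       : predictive variances from any posterior inference method
    sigma0, alpha  : hypothetical calibration parameters, tuned post hoc
    """
    d = nearest_train_distance(x_test, x_train)
    # The added term grows with distance, so uncertainty increases away
    # from the data, mimicking the behaviour a DAP is meant to induce.
    return pred_var + (sigma0 * alpha * d) ** 2

# Toy usage: 1-D regression inputs, constant in-domain variance.
rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, size=(50, 1))
x_test = np.linspace(-3, 3, 7)[:, None]
base_var = np.full(len(x_test), 0.1)  # stand-in for a model's predictive variance
print(dap_calibrated_variance(base_var, x_test, x_train))
```

Because the correction only needs distances to the training set and the model's existing predictive variances, it can run after inference has finished, consistent with the abstract's claim that DAP calibration is a post-processing step agnostic to the inference method.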