Uncertainty awareness is crucial for developing reliable machine learning models. In this work, we propose the Natural Posterior Network (NatPN) for fast and high-quality uncertainty estimation on any task where the target distribution belongs to the exponential family. NatPN therefore applies to both classification and general regression settings. Unlike many previous approaches, NatPN does not require out-of-distribution (OOD) data at training time. Instead, it leverages normalizing flows to fit a single density on a learned, low-dimensional, task-dependent latent space. For any input sample, NatPN uses the predicted likelihood to perform a Bayesian update over the target distribution. Theoretically, NatPN assigns high uncertainty far away from the training data. Empirically, our extensive experiments on calibration and OOD detection show that NatPN delivers highly competitive performance on classification, regression, and count prediction tasks.
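To make the density-driven Bayesian update concrete, here is a minimal sketch for the classification case with a Dirichlet prior (the conjugate prior of the categorical exponential-family distribution). The encoder and normalizing flow are abstracted away into a single latent-density value; the `certainty_budget` name and the specific numbers are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def natpn_bayesian_update(alpha_prior, class_probs, latent_density,
                          certainty_budget=1000.0):
    """Illustrative per-input Bayesian update in the style described above.

    alpha_prior:      Dirichlet prior parameters, shape (K,)
    class_probs:      predicted class probabilities for one input, shape (K,)
    latent_density:   flow density q(z(x)) at the input's latent code
    certainty_budget: scales the density into a pseudo-count (assumed name)
    """
    # Evidence grows with the latent density: inputs far from the training
    # data receive near-zero density, so the update barely moves the prior
    # and the posterior stays uncertain.
    evidence = certainty_budget * latent_density
    alpha_post = alpha_prior + evidence * class_probs
    return alpha_post

# In-distribution input: high latent density -> confident posterior.
alpha_in = natpn_bayesian_update(np.ones(3), np.array([0.8, 0.1, 0.1]), 0.05)
# OOD input: near-zero density -> posterior stays close to the uniform prior.
alpha_ood = natpn_bayesian_update(np.ones(3), np.array([0.8, 0.1, 0.1]), 1e-6)
```

The same scheme carries over to regression and count prediction by swapping the Dirichlet for the conjugate prior of the corresponding exponential-family likelihood (e.g. Normal-Gamma for a Normal target, Gamma for a Poisson target).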