Despite the major progress of deep models as learning machines, uncertainty estimation remains a major challenge. Existing solutions rely on modified loss functions or architectural changes. We propose to compensate for the lack of built-in uncertainty estimates by supplementing any trained network, retrospectively, with a vine copula model, yielding an overall compound we call the Vine-Copula Neural Network (VCNN). Through synthetic and real-data experiments, we show that VCNNs are task-agnostic (regression/classification) and architecture-agnostic (recurrent, fully connected), while providing reliable, better-calibrated uncertainty estimates comparable to state-of-the-art built-in uncertainty solutions.
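To make the post-hoc idea concrete, the sketch below illustrates how a copula model can be attached to a frozen network after training to produce predictive intervals. It is a minimal illustration, not the paper's method: a bivariate Gaussian copula between network predictions and targets stands in for the vine copula model, the calibration set and all names (`net` predictions, `preds_cal`, `y_cal`) are hypothetical, and only numpy/scipy are assumed.

```python
"""Minimal post-hoc uncertainty sketch in the spirit of VCNN.

Assumptions (not from the abstract): a trained regression network already
produced predictions on a held-out calibration set; a Gaussian copula is
used as a simplified stand-in for the vine copula model; all names are
hypothetical.
"""
import numpy as np
from scipy.stats import norm, rankdata


def to_normal_scores(x):
    """Map a 1-D sample to standard-normal scores via its empirical ranks."""
    u = rankdata(x) / (len(x) + 1.0)  # pseudo-observations in (0, 1)
    return norm.ppf(u)


def fit_posthoc_copula(preds_cal, y_cal):
    """Fit a bivariate Gaussian copula between network predictions and targets
    on a calibration set (stand-in for the vine copula fit)."""
    z_pred = to_normal_scores(preds_cal)
    z_y = to_normal_scores(y_cal)
    rho = np.corrcoef(z_pred, z_y)[0, 1]  # copula correlation on normal scores
    return {"rho": rho, "y_sorted": np.sort(y_cal)}


def predictive_interval(copula, preds_cal, pred_new, alpha=0.1):
    """Conditional predictive interval for y given a new network prediction.
    The conditional of a Gaussian copula is Gaussian on the normal-score
    scale; quantiles are mapped back through the empirical marginal of y."""
    rho, y_sorted = copula["rho"], copula["y_sorted"]
    # Locate the new prediction within the calibration ranks.
    u_new = (np.searchsorted(np.sort(preds_cal), pred_new) + 0.5) / (len(preds_cal) + 1.0)
    z_new = norm.ppf(np.clip(u_new, 1e-6, 1 - 1e-6))
    cond_mean = rho * z_new
    cond_std = np.sqrt(max(1.0 - rho ** 2, 1e-12))
    lo_u = norm.cdf(cond_mean + norm.ppf(alpha / 2) * cond_std)
    hi_u = norm.cdf(cond_mean + norm.ppf(1 - alpha / 2) * cond_std)
    # Empirical quantile function of y maps copula quantiles back to target units.
    return np.quantile(y_sorted, lo_u), np.quantile(y_sorted, hi_u)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical calibration data: targets and imperfect network predictions.
    y_cal = rng.normal(size=500)
    preds_cal = y_cal + 0.5 * rng.normal(size=500)
    cop = fit_posthoc_copula(preds_cal, y_cal)
    print(predictive_interval(cop, preds_cal, pred_new=1.2, alpha=0.1))
```

The full VCNN replaces the bivariate Gaussian copula above with a vine copula over the network's outputs and targets, which captures richer, non-Gaussian dependence while keeping the base network untouched.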