Deep neural networks tend to underestimate uncertainty and produce overly confident predictions. Recently proposed solutions, such as MC-Dropout and SDE-Net, require complex training and/or auxiliary out-of-distribution data. We propose a simple solution that extends the time-tested iteratively reweighted least squares (IRLS) method from generalised linear regression. We use two sub-networks to parametrise the prediction and the uncertainty estimate, enabling easy handling of complex inputs and nonlinear responses. The two sub-networks share representations and are trained via two complementary loss functions, one for the prediction and one for the uncertainty estimate, with interleaving steps as in a cooperative game. Compared with more complex models such as MC-Dropout or SDE-Net, our proposed network is simpler to implement and more robust (less sensitive to varying aleatoric and epistemic uncertainty).
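The IRLS-style alternation described above can be illustrated in a linear toy setting: one component fits the prediction by weighted least squares (weights given by the current inverse-variance estimates), the other fits a log-variance model to the squared residuals, and the two steps interleave. This is a minimal sketch under assumed linear forms for the mean and log-variance, not the paper's actual sub-networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: y = 2x + 1 with noise std growing in |x|.
n = 2000
x = rng.uniform(-2.0, 2.0, size=n)
sigma_true = 0.2 + 0.5 * np.abs(x)
y = 2.0 * x + 1.0 + rng.normal(0.0, sigma_true)

X = np.column_stack([x, np.ones(n)])          # design matrix for the mean
Z = np.column_stack([np.abs(x), np.ones(n)])  # assumed form for the log-variance

beta = np.zeros(2)   # prediction parameters
gamma = np.zeros(2)  # uncertainty (log-variance) parameters

for _ in range(20):
    # Prediction step: weighted least squares with weights 1 / sigma^2,
    # where sigma^2 = exp(Z @ gamma) is the current uncertainty estimate.
    w = np.exp(-Z @ gamma)
    Xw = X * w[:, None]
    beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)

    # Uncertainty step: regress log squared residuals on Z as a simple
    # surrogate for minimising the Gaussian NLL in gamma.
    r2 = (y - X @ beta) ** 2
    gamma = np.linalg.lstsq(Z, np.log(r2 + 1e-12), rcond=None)[0]
```

Note that a constant multiplicative bias in the fitted weights cancels in the weighted least-squares step, so the prediction parameters are recovered accurately even though this crude variance surrogate is biased; the paper's two-network formulation replaces both linear fits with learned sub-networks sharing a representation.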