Uncertainty quantification of machine learning and deep learning methods plays an important role in building trust in their results. In recent years, numerous uncertainty quantification methods have been introduced. Monte Carlo dropout (MC-Dropout) is one of the best-known techniques for quantifying uncertainty in deep learning models. In this study, we propose two new loss functions that combine cross-entropy with the Expected Calibration Error (ECE) and with Predictive Entropy (PE). The results clearly show that the proposed loss functions yield a calibrated MC-Dropout method. Our results confirm that the new hybrid loss functions substantially reduce the overlap between the distributions of uncertainty estimates for correct and incorrect predictions without sacrificing the model's overall performance.
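The abstract does not give the exact formulation of the hybrid losses, so the sketch below is only an illustration of the general idea: a cross-entropy term plus a weighted Predictive Entropy penalty, alongside the standard binned estimator of Expected Calibration Error. The weighting factor `lam` and all function names are hypothetical; note also that the binned ECE is not differentiable as written, so using it inside a training loss would in practice require a smooth surrogate.

```python
import numpy as np


def softmax(logits):
    """Row-wise softmax with the usual max-subtraction for numerical stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


def cross_entropy(probs, labels, eps=1e-12):
    """Mean negative log-likelihood of the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))


def predictive_entropy(probs, eps=1e-12):
    """Mean per-sample entropy of the predictive distribution (PE)."""
    return -np.mean(np.sum(probs * np.log(probs + eps), axis=1))


def expected_calibration_error(probs, labels, n_bins=10):
    """Standard binned ECE: confidence-weighted gap between accuracy
    and confidence, averaged over equal-width confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece


def hybrid_ce_pe_loss(logits, labels, lam=0.1):
    """Illustrative hybrid loss: cross-entropy plus a PE penalty,
    with `lam` a hypothetical trade-off weight."""
    probs = softmax(logits)
    return cross_entropy(probs, labels) + lam * predictive_entropy(probs)
```

As a sanity check, confident correct predictions should incur a lower hybrid loss than near-uniform ones, since both the cross-entropy and the entropy penalty shrink as the predictive distribution sharpens on the true class.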