Calibration can reduce overconfident predictions of deep neural networks, but can it also accelerate training? In this paper, we show that it can, when used to prioritize examples for subset selection. We study the effect of popular calibration techniques on selecting better subsets of samples during training (also called sample prioritization) and observe that calibration can improve the quality of subsets, reduce the number of examples per epoch (by at least 70%), and thereby speed up the overall training process. We further study the effect of using calibrated pre-trained models, coupled with calibration during training, to guide sample prioritization, which again appears to improve the quality of the selected samples.
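As a concrete illustration of calibration-guided sample prioritization, the sketch below scores a pool of examples by predictive entropy after temperature scaling (one popular calibration technique) and keeps only the most uncertain fraction for the next epoch. This is a minimal sketch under stated assumptions, not the paper's exact procedure: the names `calibrated_uncertainty` and `prioritize_subset`, the fixed temperature of 2.0, and the 30% keep fraction (mirroring the abstract's claim of dropping at least 70% of examples per epoch) are all illustrative choices.

```python
import torch
import torch.nn.functional as F


def calibrated_uncertainty(logits: torch.Tensor, temperature: float = 2.0) -> torch.Tensor:
    """Predictive entropy after temperature scaling.

    In practice the temperature would be fit on a held-out set
    (Guo et al., 2017); 2.0 here is a placeholder assumption.
    """
    probs = F.softmax(logits / temperature, dim=-1)
    # clamp_min avoids log(0) for near-one-hot predictions
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)


def prioritize_subset(logits: torch.Tensor, keep_fraction: float = 0.3) -> torch.Tensor:
    """Return indices of the highest-uncertainty examples.

    Keeping ~30% of the pool corresponds to training on at least
    70% fewer examples per epoch, as described in the abstract.
    """
    scores = calibrated_uncertainty(logits)
    k = max(1, int(keep_fraction * logits.size(0)))
    return scores.topk(k).indices
```

In a training loop, one would score the pool with the current model (or with a calibrated pre-trained model, as the abstract suggests) at the start of each epoch, then train only on the batch indices returned by `prioritize_subset`.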