For many real-world applications, obtaining stable and robust statistical performance is more important than simply achieving state-of-the-art predictive test accuracy, and thus robustness of neural networks is an increasingly important topic. Relatedly, data augmentation schemes have been shown to improve robustness with respect to input perturbations and domain shifts. Motivated by this, we introduce NoisyMix, a training scheme that combines data augmentations with stability training and noise injections to improve both model robustness and in-domain accuracy. This combination promotes models that are consistently more robust and that provide well-calibrated estimates of class membership probabilities. We demonstrate the benefits of NoisyMix on a range of benchmark datasets, including ImageNet-C, ImageNet-R, and ImageNet-P. Moreover, we provide theory to understand implicit regularization and robustness of NoisyMix.
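The combination described above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the toy linear model, and the hyperparameters (`alpha`, `noise_std`, `stab_weight`) are illustrative assumptions. It mixes two examples (mixup-style augmentation), injects Gaussian noise into the mixed input, and adds a stability term that penalizes the KL divergence between the model's predictions on the clean and noisy inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mixup(x1, y1, x2, y2, alpha=1.0):
    """Convex combination of two examples and their one-hot labels."""
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def noisy_mix_loss(model, x1, y1, x2, y2, noise_std=0.1, stab_weight=1.0):
    # Data augmentation: mix two training examples.
    x_mix, y_mix = mixup(x1, y1, x2, y2)
    # Noise injection: perturb the mixed input with Gaussian noise.
    x_noisy = x_mix + noise_std * rng.normal(size=x_mix.shape)
    p_clean = softmax(model(x_mix))
    p_noisy = softmax(model(x_noisy))
    # Supervised term: cross-entropy on the clean mixed example.
    ce = -np.sum(y_mix * np.log(p_clean + 1e-12))
    # Stability term: KL divergence between clean and noisy predictions,
    # encouraging the model to respond consistently to perturbed inputs.
    kl = np.sum(p_clean * (np.log(p_clean + 1e-12) - np.log(p_noisy + 1e-12)))
    return ce + stab_weight * kl

# Toy linear "model": 4-dimensional inputs, 3 classes.
W = rng.normal(size=(4, 3))
model = lambda x: x @ W
x1, x2 = rng.normal(size=4), rng.normal(size=4)
y1, y2 = np.eye(3)[0], np.eye(3)[1]
loss = noisy_mix_loss(model, x1, y1, x2, y2)
```

In a real training loop the total loss would be minimized over mini-batches with a deep network; the stability weight trades off in-domain accuracy against robustness to the injected perturbations.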