For many real-world applications, obtaining stable and robust statistical performance is more important than simply achieving state-of-the-art predictive test accuracy, and thus the robustness of neural networks is an increasingly important topic. Relatedly, data augmentation schemes have been shown to improve robustness with respect to input perturbations and domain shifts. Motivated by this, we introduce NoisyMix, a novel training scheme that promotes stability and leverages noisy augmentations in input and feature space to improve both model robustness and in-domain accuracy. NoisyMix produces models that are consistently more robust and that provide well-calibrated estimates of class membership probabilities. We demonstrate the benefits of NoisyMix on a range of benchmark datasets, including ImageNet-C, ImageNet-R, and ImageNet-P. Moreover, we provide theory to understand the implicit regularization and robustness of NoisyMix.
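The core idea of combining mixup-style interpolation with noise injection can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the function name `noisy_mixup`, the Beta-distributed mixing coefficient, and the additive/multiplicative Gaussian noise model are illustrative assumptions about how such an augmentation might look.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_mixup(x1, x2, alpha=1.0, add_noise_level=0.1, mult_noise_level=0.1):
    """Convexly mix two inputs and inject Gaussian noise.

    A simplified sketch of noisy augmentation: mixup-style
    interpolation (lam ~ Beta(alpha, alpha)) followed by additive and
    multiplicative Gaussian perturbations. The same transform could be
    applied to intermediate feature maps rather than raw inputs.
    """
    lam = rng.beta(alpha, alpha)                    # mixing coefficient in [0, 1]
    mixed = lam * x1 + (1.0 - lam) * x2             # mixup interpolation
    additive = add_noise_level * rng.standard_normal(mixed.shape)
    multiplicative = 1.0 + mult_noise_level * rng.standard_normal(mixed.shape)
    return multiplicative * mixed + additive, lam

# Example: mix two toy "images" and perturb the result.
x1 = np.zeros((8, 8))
x2 = np.ones((8, 8))
augmented, lam = noisy_mixup(x1, x2)
```

The same interpolation applied in feature space (i.e., to hidden activations during the forward pass) is what distinguishes feature-level schemes from plain input-space mixup.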