Data augmentation is a simple yet effective way to improve the robustness of deep neural networks (DNNs). Diversity and hardness are two complementary dimensions of data augmentation for achieving robustness. For example, AugMix explores random compositions of a diverse set of augmentations to broaden coverage, while adversarial training generates adversarially hard samples to expose model weaknesses. Motivated by this, we propose a data augmentation framework, termed AugMax, to unify the two aspects of diversity and hardness. AugMax first randomly samples multiple augmentation operators and then learns an adversarial mixture of the selected operators. Being a stronger form of data augmentation, AugMax leads to a significantly augmented input distribution, which makes model training more challenging. To address this, we further design a disentangled normalization module, termed DuBIN (Dual-Batch-and-Instance Normalization), that disentangles the instance-wise feature heterogeneity arising from AugMax. Experiments show that AugMax-DuBIN leads to significantly improved out-of-distribution robustness, outperforming prior art by 3.03%, 3.49%, 1.82% and 0.71% on CIFAR10-C, CIFAR100-C, Tiny ImageNet-C and ImageNet-C, respectively. Code and pretrained models are available at: https://github.com/VITA-Group/AugMax.
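To make the two ingredients concrete, the sketch below illustrates the AugMax inner maximization as we understand it from the abstract: diversity comes from randomly sampling k operators, and hardness comes from learning the mixing weights w and a mix coefficient m by gradient ascent on the training loss. This is a minimal illustration, not the paper's exact implementation; the function name `augmax_perturb`, the argument `op_pool`, the number of ascent steps, and the PGD-style sign update are all assumptions.

```python
import random
import torch
import torch.nn.functional as F


def augmax_perturb(model, x, y, op_pool, k=3, steps=5, step_size=0.1):
    """Sketch of the AugMax inner maximization (names are illustrative).

    Diversity: draw k operators at random from op_pool.
    Hardness:  learn mixing weights w and a mix coefficient m by
               gradient *ascent* on the training loss (PGD-style update
               is an assumption, not confirmed by the abstract).
    """
    ops = random.sample(op_pool, k)                # random operator selection
    views = [op(x) for op in ops]                  # precompute augmented views

    # Learnable mixing parameters for the adversarial mixture.
    w_logits = torch.zeros(k, device=x.device, requires_grad=True)
    m_logit = torch.zeros(1, device=x.device, requires_grad=True)

    def mix():
        w = torch.softmax(w_logits, dim=0)         # convex combination of ops
        m = torch.sigmoid(m_logit)                 # blend with the clean input
        x_mix = sum(w[i] * views[i] for i in range(k))
        return (1 - m) * x + m * x_mix

    for _ in range(steps):
        loss = F.cross_entropy(model(mix()), y)
        g_w, g_m = torch.autograd.grad(loss, [w_logits, m_logit])
        with torch.no_grad():                      # ascend to *maximize* the loss
            w_logits += step_size * g_w.sign()
            m_logit += step_size * g_m.sign()

    with torch.no_grad():
        return mix()                               # hardest mixture found
```

In a training loop, the outer minimization would then update the model weights on the returned adversarially mixed samples (typically alongside the clean batch), giving the min-max structure that unifies diversity and hardness.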
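DuBIN can likewise be sketched as a normalization layer that pairs instance normalization (to absorb instance-wise feature heterogeneity) with two batch-normalization branches, one for clean and one for AugMax inputs. The channel-split design, the `in_ratio` parameter, and the `route` flag below are assumptions for illustration; the paper's exact layer configuration may differ.

```python
import torch
import torch.nn as nn


class DuBIN(nn.Module):
    """Illustrative Dual-Batch-and-Instance Normalization layer.

    Assumptions: a fraction of channels is instance-normalized (IBN-style),
    and the remaining channels pass through one of two BN branches selected
    by a route flag (clean vs. AugMax-augmented samples).
    """

    def __init__(self, channels, in_ratio=0.5):
        super().__init__()
        self.in_ch = int(channels * in_ratio)
        self.inorm = nn.InstanceNorm2d(self.in_ch, affine=True)
        self.bn_clean = nn.BatchNorm2d(channels - self.in_ch)
        self.bn_aug = nn.BatchNorm2d(channels - self.in_ch)

    def forward(self, x, route="clean"):
        # Split channels: IN handles instance-wise variation,
        # dual BNs keep clean/augmented batch statistics separate.
        x_in, x_bn = torch.split(x, [self.in_ch, x.size(1) - self.in_ch], dim=1)
        bn = self.bn_clean if route == "clean" else self.bn_aug
        return torch.cat([self.inorm(x_in), bn(x_bn)], dim=1)
```

The intended effect is that clean batches route through `bn_clean` and AugMax batches through `bn_aug`, so the heavily augmented distribution does not corrupt the clean batch statistics, while the shared IN channels disentangle per-instance appearance shifts.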