Deep convolutional neural network (CNN) based models are vulnerable to adversarial attacks. One possible reason is that the embedding space of CNN-based models is sparse, leaving a large space in which adversarial samples can be generated. In this study, we propose a method, denoted as Dynamic Feature Aggregation, to compress the embedding space with a novel regularization. In particular, the convex combination of two samples is regarded as the pivot for aggregation. In the embedding space, the selected samples are guided to be similar to the representation of the pivot. On the other hand, to mitigate the trivial solution of such regularization, the last fully-connected layer of the model is replaced by an orthogonal classifier, in which the embedding codes for different classes are processed orthogonally and separately. With the regularization and the orthogonal classifier, a more compact embedding space is obtained, which accordingly improves the model's robustness against adversarial attacks. An average accuracy of 56.91% is achieved by our method on CIFAR-10 against various attack methods, which significantly surpasses a solid baseline (Mixup) by a margin of 37.31%. More surprisingly, empirical results show that the proposed method can also achieve state-of-the-art performance for out-of-distribution (OOD) detection, owing to the learned compact feature space. An F1 score of 0.937 is achieved by the proposed method when adopting CIFAR-10 as the in-distribution (ID) dataset and LSUN as the OOD dataset. Code is available at https://github.com/HaozheLiu-ST/DynamicFeatureAggregation.
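To make the pivot-based regularization concrete, below is a minimal PyTorch sketch of the idea described above: a convex combination of two inputs serves as the pivot, and the embeddings of the two original samples are pulled toward the pivot's embedding. The function name, the Beta-sampled mixing coefficient, the MSE distance, and the `encoder` argument are illustrative assumptions, not the authors' exact implementation (see the repository for that).

```python
import torch
import torch.nn.functional as F

def pivot_aggregation_loss(encoder, x1, x2, alpha=1.0):
    """Hypothetical sketch of the pivot-based aggregation regularizer.

    `encoder` is assumed to map an input batch to its embedding codes;
    `x1` and `x2` are two batches of samples to be aggregated.
    """
    # Draw a mixing coefficient (Mixup-style Beta sampling is an assumption).
    lam = torch.distributions.Beta(alpha, alpha).sample().item()

    # Convex combination of the two samples acts as the pivot.
    pivot = lam * x1 + (1.0 - lam) * x2

    # Embeddings of the two samples and of the pivot.
    z1, z2 = encoder(x1), encoder(x2)
    z_pivot = encoder(pivot)

    # Regularization: guide both embeddings to be similar to the pivot's
    # representation (MSE and the detach on the pivot are assumptions).
    reg = F.mse_loss(z1, z_pivot.detach()) + F.mse_loss(z2, z_pivot.detach())
    return reg
```

In practice, a term like this would be added to the standard classification loss, with the orthogonal classifier on top of the embeddings preventing the trivial collapse of all codes to a single point.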