K-Nearest Neighbor (kNN)-based deep learning methods have been applied to many applications due to their simplicity and geometric interpretability. However, the robustness of kNN-based classification models has not been thoroughly explored, and kNN attack strategies are underdeveloped. In this paper, we propose an Adversarial Soft kNN (ASK) loss, both to design more effective kNN attack strategies and to develop better defenses against them. Our ASK loss approach has two advantages. First, the ASK loss can better approximate the kNN's probability of classification error than objectives proposed in previous works. Second, the ASK loss is interpretable: it preserves the mutual information between the perturbed input and the in-class-reference data. We use the ASK loss to generate a novel attack method called the ASK-Attack (ASK-Atk), which shows superior attack efficiency and accuracy degradation relative to previous kNN attacks. Based on the ASK-Atk, we then derive an ASK-\underline{Def}ense (ASK-Def) method that optimizes the worst-case training loss induced by ASK-Atk. Experiments on CIFAR-10 (ImageNet) show that (i) ASK-Atk achieves $\geq 13\%$ ($\geq 13\%$) improvement in attack success rate over previous kNN attacks, and (ii) ASK-Def outperforms the conventional adversarial training method by $\geq 6.9\%$ ($\geq 3.5\%$) in terms of robustness improvement.
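The abstract does not reproduce the ASK loss itself, but the underlying soft-kNN idea can be sketched: replace hard nearest-neighbor voting with a softmax over (negative) distances to reference points, so the resulting class probabilities are differentiable and can be attacked or trained against. The sketch below is a minimal NumPy illustration under that assumption; the function names, the squared-Euclidean distance, and the temperature parameter are all hypothetical choices, not the paper's exact formulation.

```python
import numpy as np

def soft_knn_class_probs(x, refs, labels, n_classes, temp=1.0):
    # Soft analogue of kNN voting: weight each reference point by a
    # softmax over its negative squared distance to the query x, so
    # closer references contribute more to the class probabilities.
    d2 = np.sum((refs - x) ** 2, axis=1)
    w = np.exp(-d2 / temp)
    w /= w.sum()
    probs = np.zeros(n_classes)
    for wi, yi in zip(w, labels):
        probs[yi] += wi
    return probs

def ask_style_loss(x, refs, labels, y_true, n_classes, temp=1.0):
    # Cross-entropy on the soft-kNN probability of the true class.
    # An attacker would perturb x to maximize this loss; a defender
    # would minimize its worst case over allowed perturbations.
    p = soft_knn_class_probs(x, refs, labels, n_classes, temp)
    return -np.log(p[y_true] + 1e-12)
```

Because the loss is a smooth function of `x`, gradient-based perturbations (as in ASK-Atk) and worst-case training (as in ASK-Def) both become tractable, which a hard top-k vote does not allow.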