We propose the adversarially robust kernel smoothing (ARKS) algorithm, which combines kernel smoothing, robust optimization, and adversarial training for robust learning. Our method is motivated by a convex-analysis perspective on distributionally robust optimization based on probability metrics, such as the Wasserstein distance and the maximum mean discrepancy. We adapt the integral operator using supremal convolution from convex analysis to construct a novel function majorant that enforces robustness. Our method is simple in form and applies to general loss functions and machine learning models. Furthermore, we report experiments with general machine learning models, including deep neural networks, demonstrating that ARKS performs competitively with state-of-the-art methods based on the Wasserstein distance.
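The idea of a kernel-smoothed robust surrogate can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's implementation: it assumes the robust surrogate takes the kernel-smoothed form sup_u ℓ(θ, u) k(u, x) with a Gaussian kernel, and approximates the inner supremum by gradient ascent on the perturbed input u (an adversarial-training-style inner loop), starting from the clean input x. The model, loss, step sizes, and the finite-difference gradient are all illustrative choices.

```python
import numpy as np

def gauss_kernel(u, x, sigma):
    """Gaussian kernel k(u, x) = exp(-||u - x||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((u - x) ** 2) / (2 * sigma ** 2))

def loss(theta, u, y):
    """Squared loss of a toy linear model at input u with label y."""
    return (np.dot(theta, u) - y) ** 2

def smoothed_loss(theta, x, y, sigma, steps=50, lr=0.1, eps=1e-4):
    """Approximate sup_u loss(theta, u, y) * k(u, x) by gradient ascent on u.

    The ascent starts at the clean input x; since k(x, x) = 1, the returned
    value always upper-bounds the clean loss, i.e. it acts as a majorant.
    Finite-difference gradients keep the sketch dependency-free.
    """
    u = x.astype(float).copy()
    best = loss(theta, u, y) * gauss_kernel(u, x, sigma)
    for _ in range(steps):
        base = loss(theta, u, y) * gauss_kernel(u, x, sigma)
        grad = np.zeros_like(u)
        for i in range(len(u)):
            up = u.copy()
            up[i] += eps
            grad[i] = (loss(theta, up, y) * gauss_kernel(up, x, sigma) - base) / eps
        u += lr * grad  # inner maximization over the perturbed input u
        best = max(best, loss(theta, u, y) * gauss_kernel(u, x, sigma))
    return best

# Usage: the smoothed (robust) loss majorizes the clean loss at each data point.
theta = np.array([1.0, -2.0])
x = np.array([0.5, 1.0])
y = 0.0
clean = loss(theta, x, y)
robust = smoothed_loss(theta, x, y, sigma=1.0)
```

In a training loop, one would minimize the smoothed loss over θ (e.g. by backpropagation in a deep network), so the outer minimization sees the adversarially smoothed objective rather than the pointwise loss.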