Recent studies show that training deep neural networks (DNNs) with Lipschitz constraints can enhance adversarial robustness and other model properties such as stability. In this paper, we propose a layer-wise orthogonal training method (LOT) to effectively train 1-Lipschitz convolution layers by parametrizing an orthogonal matrix with an unconstrained matrix. We then efficiently compute the inverse square root of a convolution kernel by transforming the input domain to the Fourier frequency domain. On the other hand, since existing works show that semi-supervised training helps improve empirical robustness, we aim to bridge the gap and prove that semi-supervised learning also improves the certified robustness of Lipschitz-bounded models. We conduct comprehensive evaluations of LOT under different settings and show that it significantly outperforms baselines in deterministic l2 certified robustness and scales to deeper neural networks. In the supervised setting, we improve the state-of-the-art certified robustness for all architectures (e.g., from 59.04% to 63.50% on CIFAR-10 and from 32.57% to 34.59% on CIFAR-100 at radius rho = 36/255 for 40-layer networks). With semi-supervised learning over unlabeled data, we improve the state-of-the-art certified robustness on CIFAR-10 at rho = 108/255 from 36.04% to 42.39%. In addition, LOT consistently outperforms baselines across model architectures with only 1/3 of the evaluation time.
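To make the core idea concrete, below is a minimal sketch (not the authors' released implementation) of the Fourier-domain orthogonal parametrization described above: the unconstrained kernel is mapped to the frequency domain with an FFT, and at each frequency the matrix V is orthogonalized as V (V^H V)^{-1/2}, with the inverse square root approximated by a Newton-Schulz iteration. The function names, tensor shapes, and iteration count are illustrative assumptions.

```python
# Sketch of LOT-style orthogonalization in the Fourier domain (assumptions noted above).
import torch

def newton_schulz_inv_sqrt(A, iters=20):
    """Approximate A^{-1/2} for batched Hermitian positive-definite matrices A."""
    n = A.shape[-1]
    I = torch.eye(n, dtype=A.dtype, device=A.device).expand_as(A)
    # Normalize by the trace so the coupled iteration converges.
    norm = A.diagonal(dim1=-2, dim2=-1).sum(-1).real.clamp(min=1e-8)
    norm = norm.reshape(*A.shape[:-2], 1, 1)
    Y, Z = A / norm, I.clone()
    for _ in range(iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y, Z = Y @ T, T @ Z          # Y -> (A/norm)^{1/2}, Z -> (A/norm)^{-1/2}
    return Z / norm.sqrt()

def orthogonalize_kernel(kernel, spatial_size):
    """Map an unconstrained kernel of shape (c, c, k, k) to per-frequency
    orthogonal matrices Q(f) = V(f) (V(f)^H V(f))^{-1/2}."""
    V = torch.fft.fft2(kernel, s=(spatial_size, spatial_size))  # (c, c, n, n), complex
    V = V.permute(2, 3, 0, 1)                                   # (n, n, c, c): one matrix per frequency
    A = V.conj().transpose(-2, -1) @ V                          # V^H V, Hermitian PSD
    Q = V @ newton_schulz_inv_sqrt(A)                           # orthogonal at each frequency
    return Q  # apply to the FFT of the input, then take the inverse FFT

# Example: a hypothetical 3x3, 16-channel orthogonal convolution on 32x32 inputs.
if __name__ == "__main__":
    torch.manual_seed(0)
    kernel = torch.randn(16, 16, 3, 3).to(torch.complex64)
    Q = orthogonalize_kernel(kernel, 32)
    # Orthogonality check at one frequency: Q^H Q should be close to identity.
    err = (Q[0, 0].conj().T @ Q[0, 0] - torch.eye(16, dtype=torch.complex64)).abs().max()
    print(f"max deviation from identity: {err:.2e}")
```

Because each Q(f) is (approximately) unitary, the resulting convolution preserves l2 norms, which is what makes the layer 1-Lipschitz by construction rather than by a post-hoc bound.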