Model compression and model defense for deep neural networks (DNNs) have been extensively but separately studied. Considering the co-importance of model compactness and robustness in practical applications, several prior works have explored improving the adversarial robustness of sparse neural networks. However, the structured sparse models obtained by existing works suffer severe degradation in both benign and robust accuracy, posing a challenging dilemma between the robustness and structuredness of compact DNNs. To address this problem, in this paper we propose CSTAR, an efficient solution that simultaneously imposes low-rankness-based Compactness, high STructuredness and high Adversarial Robustness on the target DNN models. By formulating the low-rankness and robustness requirements within the same framework and determining the ranks globally, the compressed DNNs achieve both high compression performance and strong adversarial robustness. Evaluations of various DNN models on different datasets demonstrate the effectiveness of CSTAR. Compared with state-of-the-art robust structured pruning methods, CSTAR shows consistently better performance. For instance, when compressing ResNet-18 on CIFAR-10, CSTAR achieves up to 20.07% and 11.91% improvement in benign accuracy and robust accuracy, respectively. When compressing ResNet-18 with a 16x compression ratio on ImageNet, CSTAR obtains an 8.58% benign accuracy gain and a 4.27% robust accuracy gain over the existing robust structured pruning method.
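To make the high-level idea concrete, the sketch below illustrates the general recipe the abstract alludes to: adversarial training combined with a hard low-rank projection of weight matrices. This is only a minimal, assumed illustration of low-rank compression plus robust training, not CSTAR's actual joint formulation or its global rank-selection scheme; the rank value, PGD settings, and restriction to fully connected layers are placeholder assumptions.

```python
# Minimal sketch (assumed, not the paper's CSTAR formulation): adversarial training
# interleaved with a hard low-rank SVD projection of linear-layer weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


def truncate_to_low_rank(weight: torch.Tensor, rank: int) -> torch.Tensor:
    """Project a 2-D weight matrix onto its best rank-`rank` approximation via SVD."""
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    return U[:, :rank] @ torch.diag(S[:rank]) @ Vh[:rank, :]


def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=10):
    """Standard L-infinity PGD adversarial example generation."""
    x_adv = x + torch.empty_like(x).uniform_(-eps, eps)
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()


def robust_low_rank_training_step(model, x, y, optimizer, rank=16):
    """One step: minimize adversarial loss, then project linear layers to low rank.
    `rank=16` is an arbitrary placeholder; CSTAR instead determines ranks globally."""
    x_adv = pgd_attack(model, x, y)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    with torch.no_grad():  # hard low-rank projection after the gradient update
        for m in model.modules():
            if isinstance(m, nn.Linear):  # conv layers would first need reshaping to 2-D
                m.weight.copy_(truncate_to_low_rank(m.weight, rank))
    return loss.item()
```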