This paper examines the impact of static sparsity on the robustness of a trained network to weight perturbations, data corruption, and adversarial examples. We show that, up to a certain sparsity achieved by increasing network width and depth while keeping the network capacity fixed, sparsified networks consistently match and often outperform their initially dense counterparts. For very high sparsity, robustness and accuracy decline together because connectivity between network layers becomes too loose. Our findings indicate that the rapid robustness drop observed in the literature for compressed networks stems from reduced network capacity rather than from sparsity itself.