Recent advances in adversarial attacks reveal the vulnerability of deep neural networks found by Neural Architecture Search (NAS). Although NAS methods can discover network architectures with state-of-the-art performance, adversarial robustness and resource constraints are often ignored in NAS. To address this problem, we propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method that searches for a neural network architecture by taking performance, robustness, and resource constraints into consideration. Unlike existing NAS methods, the objective function of the proposed E2RNAS method is formulated as a bi-level optimization problem whose upper-level problem is a multi-objective optimization problem. To solve this objective function, we integrate the multiple-gradient descent algorithm, a widely studied gradient-based multi-objective optimization algorithm, with bi-level optimization. Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model sizes and comparable classification accuracy.
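For concreteness, the following is a minimal sketch of the kind of formulation the abstract describes, not the paper's exact objective: the symbols $\alpha$ (architecture parameters), $w$ (network weights), and the three upper-level losses $\mathcal{L}_{\mathrm{perf}}$, $\mathcal{L}_{\mathrm{rob}}$, $\mathcal{L}_{\mathrm{res}}$ (performance, robustness, and resource objectives) are assumed notation.
\begin{equation}
\min_{\alpha}\;
\bigl( \mathcal{L}_{\mathrm{perf}}(w^{*}(\alpha), \alpha),\;
       \mathcal{L}_{\mathrm{rob}}(w^{*}(\alpha), \alpha),\;
       \mathcal{L}_{\mathrm{res}}(\alpha) \bigr)
\quad \text{s.t.}\quad
w^{*}(\alpha) = \arg\min_{w}\, \mathcal{L}_{\mathrm{train}}(w, \alpha).
\end{equation}
At the upper level, a multiple-gradient descent step would combine the per-objective gradients $g_i = \nabla_{\alpha} \mathcal{L}_i$ using min-norm weights on the simplex and descend along the resulting common direction:
\begin{equation}
\lambda^{*} = \operatorname*{arg\,min}_{\lambda \ge 0,\; \sum_i \lambda_i = 1}
\Bigl\lVert \sum_i \lambda_i g_i \Bigr\rVert_2^{2},
\qquad
d = -\sum_i \lambda_i^{*} g_i .
\end{equation}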