The most scalable approaches to certifying neural network robustness depend on computing sound linear lower and upper bounds for the network's activation functions. Current approaches are limited in that the linear bounds must be handcrafted by an expert and can be sub-optimal, especially when the network's architecture composes operations using, for example, multiplication, as in LSTMs and the recently popular Swish activation. The dependence on an expert prevents the application of robustness certification to developments in the state of the art of activation functions, and furthermore the lack of tightness guarantees may give a false sense of insecurity about a particular model. To the best of our knowledge, we are the first to consider the problem of automatically computing tight linear bounds for arbitrary n-dimensional activation functions. We propose LinSyn, the first approach that achieves tight bounds for any arbitrary activation function while leveraging only the mathematical definition of the activation function itself. Our approach uses an efficient heuristic to synthesize bounds that are tight and usually sound, and then verifies their soundness (adjusting the bounds if necessary) using the highly optimized branch-and-bound SMT solver dReal. Even though our approach depends on an SMT solver, we show that the runtime is reasonable in practice and, compared with the state of the art, our approach often achieves 2-5X tighter final output bounds and more than quadruple the certified robustness.
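The synthesize-then-verify loop described above can be made concrete with a small example. The sketch below shows only the soundness-check step, assuming dReal's Python bindings; the helper `is_sound_upper_bound`, the candidate coefficients, and the interval are illustrative placeholders, not LinSyn's actual implementation. It asks dReal whether any point in the interval violates a candidate linear upper bound for the Swish activation; an unsat answer proves the bound sound on that interval.

```python
# Minimal sketch: check a candidate linear upper bound for Swish with dReal.
# Assumes dReal's Python bindings (pip install dreal); the helper name,
# coefficients, and interval below are hypothetical, for illustration only.
from dreal import Variable, And, exp, CheckSatisfiability

def is_sound_upper_bound(a, b, lo, hi, delta=1e-4):
    """Does u(x) = a*x + b over-approximate swish(x) = x*sigmoid(x) on [lo, hi]?"""
    x = Variable("x")
    swish = x / (1 + exp(-x))  # x * sigmoid(x)
    # Counterexample query: is there an x in [lo, hi] that violates the bound?
    violation = And(lo <= x, x <= hi, swish > a * x + b)
    cex = CheckSatisfiability(violation, delta)
    # None means the query is truly unsatisfiable, so the bound is sound;
    # a delta-sat box localizes where the bound may need to be lifted.
    return cex is None, cex

# The chord of Swish over [-2, 2], lifted slightly, is a sound upper bound
# there because Swish is convex on this interval.
sound, cex = is_sound_upper_bound(a=0.5, b=0.77, lo=-2.0, hi=2.0)
print("sound" if sound else f"possible violation near:\n{cex}")
```

Because dReal is delta-complete, an unsat result is a genuine proof of soundness, while a delta-sat result only localizes a potential violation; in the latter case the intercept can be conservatively shifted upward and the query re-run, which corresponds to the "adjusting the bounds if necessary" step mentioned above.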