Abstracting neural networks by the constraints they impose on their inputs and outputs can be very useful both in analyzing neural network classifiers and in deriving optimization-based algorithms for certifying the stability and robustness of feedback systems that involve neural networks. In this paper, we propose a convex program, in the form of a Linear Matrix Inequality (LMI), to certify incremental quadratic constraints on the map of a neural network over a region of interest. These certificates can capture several useful properties such as (local) Lipschitz continuity, one-sided Lipschitz continuity, invertibility, and contraction. We illustrate the utility of our approach in two different settings. First, we develop a semidefinite program to compute guaranteed and sharp upper bounds on the local Lipschitz constant of neural networks, and illustrate the results on random networks as well as networks trained on MNIST. Second, we consider a linear time-invariant system in feedback with an approximate model predictive controller parameterized by a neural network. We then turn the stability analysis into a semidefinite feasibility program and estimate an ellipsoidal invariant set for the closed-loop system.
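To give a concrete flavor of the Lipschitz-bound setting described above, the following is a minimal numpy sketch, not the paper's implementation: for a one-hidden-layer network with slope-restricted activations (e.g. ReLU), an incremental-QC multiplier matrix restricted to the scalar family T = tI lets the LMI collapse, via a Schur complement, into an eigenvalue computation that can be minimized by a simple grid search over t. All weights and dimensions here are illustrative stand-ins.

```python
import numpy as np

# Hypothetical one-hidden-layer ReLU network f(x) = W1 @ relu(W0 @ x);
# the weights are random stand-ins, not taken from the paper's experiments.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 16, 3
W0 = rng.standard_normal((n_hid, n_in)) / np.sqrt(n_in)
W1 = rng.standard_normal((n_out, n_hid)) / np.sqrt(n_hid)

def lipschitz_certificate(W0, W1, num_t=200):
    """Upper bound on the Lipschitz constant from an incremental-QC LMI.

    Restricting the diagonal multiplier to T = t*I, the LMI
        [[-rho*I, W0.T @ T], [T @ W0, W1.T @ W1 - 2*T]] <= 0
    reduces by a Schur complement (valid when 2tI - W1.T W1 > 0, i.e.
    t > ||W1||^2 / 2) to
        rho >= lambda_max(t^2 * W0.T @ (2tI - W1.T W1)^{-1} @ W0),
    so we scan feasible t and keep the smallest certified rho.
    """
    s = np.linalg.norm(W1, 2) ** 2                 # spectral norm squared
    best = np.inf
    for t in np.linspace(0.5 * s * 1.001, 4.0 * s, num_t):
        C = 2.0 * t * np.eye(W1.shape[1]) - W1.T @ W1   # positive definite
        rho = np.linalg.eigvalsh(t**2 * W0.T @ np.linalg.solve(C, W0)).max()
        best = min(best, rho)
    return np.sqrt(best)

L_cert = lipschitz_certificate(W0, W1)
L_trivial = np.linalg.norm(W1, 2) * np.linalg.norm(W0, 2)  # norm product
L_linear = np.linalg.norm(W1 @ W0, 2)  # Jacobian where all ReLUs are active
print(L_linear, L_cert, L_trivial)     # L_linear <= L_cert <= L_trivial
```

The scalar restriction T = tI makes the certificate looser than the full diagonal-multiplier SDP, but it already sits between the trivial norm-product bound and the spectral norm of the all-active linearization, which is the kind of gap the full semidefinite program closes further.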