The Lipschitz constant of neural networks plays an important role in several contexts of deep learning, ranging from robustness certification and regularization to stability analysis of systems with neural network controllers. Obtaining tight bounds on the Lipschitz constant is therefore important. We introduce LipBaB, a branch-and-bound framework to compute certified bounds on the local Lipschitz constant of deep neural networks with ReLU activation functions up to any desired precision. We achieve this by bounding the norms of the Jacobians corresponding to the different activation patterns the network can exhibit within the input domain. Our algorithm can provide provably exact computation of the Lipschitz constant for any p-norm.
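To make the core idea concrete, the following is a minimal sketch (not the LipBaB algorithm itself): for a piecewise-linear ReLU network, the Jacobian on any linear region with a fixed activation pattern is a product of the weight matrices with diagonal 0/1 masks, and maximizing the induced p-norm of these Jacobians over the patterns reachable in the input domain bounds the local Lipschitz constant. The toy network, its dimensions, and the brute-force enumeration over all patterns are illustrative assumptions; LipBaB replaces this enumeration with branch and bound.

```python
import itertools
import numpy as np

# Toy one-hidden-layer ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2.
# On a region where the hidden activation pattern s is fixed, the Jacobian
# equals W2 @ diag(s) @ W1. Taking the max induced p-norm over all patterns
# gives a (generally loose) upper bound on the local Lipschitz constant.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # hidden_dim x input_dim
W2 = rng.standard_normal((2, 4))   # output_dim x hidden_dim

def jacobian_for_pattern(pattern):
    """Jacobian of the network on the linear region with the given ReLU pattern."""
    D = np.diag(pattern)           # 1 where the ReLU is active, 0 where inactive
    return W2 @ D @ W1

# Enumerate all 2^4 activation patterns (feasible only for toy networks).
p = 2  # np.linalg.norm supports induced matrix norms for ord in {1, 2, inf}
lip_upper = max(
    np.linalg.norm(jacobian_for_pattern(np.array(s)), ord=p)
    for s in itertools.product([0.0, 1.0], repeat=4)
)
print(f"Upper bound on the local Lipschitz constant (p={p}): {lip_upper:.4f}")
```

Enumerating patterns is exponential in the number of neurons and ignores which patterns are actually feasible inside the input domain; the branch-and-bound procedure described above prunes infeasible patterns and tightens the bound to any desired precision.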