The local Lipschitz constant of a neural network is a useful metric with applications in robustness, generalization, and fairness evaluation. We provide novel analytic results relating the local Lipschitz constant of nonsmooth vector-valued functions to a maximization over the norm of the generalized Jacobian. We present a sufficient condition under which backpropagation always returns an element of the generalized Jacobian, and reframe the problem over this broad class of functions. We show strong inapproximability results for estimating Lipschitz constants of ReLU networks, and then formulate an algorithm to compute these quantities exactly. We leverage this algorithm to evaluate the tightness of competing Lipschitz estimators and the effects of regularized training on the Lipschitz constant.
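In symbols (a sketch under assumed notation, not quoted from the paper): writing $\partial f(x)$ for Clarke's generalized Jacobian of a locally Lipschitz $f$ at $x$, the analytic result identifies the local Lipschitz constant over a neighborhood $\mathcal{X}$ with a maximization of the Jacobian norm,
\[
  L(f, \mathcal{X}) \;=\; \sup_{x \in \mathcal{X}} \; \sup_{G \in \partial f(x)} \lVert G \rVert .
\]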
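To make the backpropagation claim concrete, here is a minimal sketch, assuming a small hypothetical PyTorch MLP; it uses generic autograd sampling to lower-bound the local Lipschitz constant, and is an illustrative heuristic, not the paper's exact-computation algorithm:

```python
import torch

torch.manual_seed(0)

# Hypothetical small ReLU network (architecture chosen for illustration only).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 3),
)

def jacobian_spectral_norm(x):
    # Autograd backpropagates once per output coordinate; at ReLU kinks it
    # still returns an element of the generalized Jacobian (ReLU'(0) = 0,
    # which lies in the Clarke subdifferential [0, 1]).
    J = torch.autograd.functional.jacobian(net, x)  # shape (3, 2)
    return torch.linalg.matrix_norm(J, ord=2).item()  # operator (l2) norm

# Sampling Jacobian norms inside a small l-infinity ball around x0 yields a
# LOWER bound on the local Lipschitz constant over that ball.
x0 = torch.zeros(2)
radius = 0.1
lower_bound = max(
    jacobian_spectral_norm(x0 + radius * (2 * torch.rand(2) - 1))
    for _ in range(256)
)
print(f"Sampled lower bound on the local Lipschitz constant: {lower_bound:.4f}")
```

Such sampled norms can only underestimate the true local constant; closing the gap between lower-bound heuristics of this kind and certified values is the role of the exact algorithm the abstract describes.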