Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems. Consequently, they have recently attracted considerable attention in the deep learning community. Unfortunately, it has been shown both theoretically and empirically that networks with ReLU activation functions perform poorly under such constraints. In contrast, neural networks with learnable 1-Lipschitz linear splines are known to be more expressive in theory. In this paper, we show that such networks are solutions of a functional optimization problem with second-order total-variation regularization. Furthermore, we propose an efficient method to train these 1-Lipschitz deep spline neural networks. Our numerical experiments on a variety of tasks show that our trained networks match or outperform networks with activation functions specifically tailored to Lipschitz-constrained architectures.