We introduce Fiedler regularization, a novel approach for regularizing neural networks that utilizes spectral/graphical information. Existing regularization methods often focus on penalizing weights in a global/uniform manner that ignores the connectivity structure of the neural network. We propose to use the Fiedler value of the neural network's underlying graph as a tool for regularization. We provide theoretical motivation for this approach via spectral graph theory and demonstrate several properties of the Fiedler value that make it useful as a regularization tool. We develop an approximate, variational approach for faster computation during training, and give an alternative formulation of the framework as a structurally weighted $\text{L}_1$ penalty, thereby linking our approach to sparsity induction. We derive uniform generalization error bounds for Fiedler regularization via a Rademacher complexity analysis. Experiments on several datasets comparing Fiedler regularization with classical regularization methods, such as dropout and weight decay, demonstrate its efficacy. This is a journal extension of the conference paper by Tam and Dunson (2020).
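As a minimal illustration of the quantity the abstract refers to, the sketch below treats a small feedforward network as an undirected weighted graph (edge weights $|w_{ij}|$), forms its graph Laplacian $L = D - A$, and returns the second-smallest eigenvalue of $L$, the Fiedler value, which would enter the training objective as a penalty term. This is a sketch under stated assumptions, not the paper's implementation: the layer sizes, the dense-connectivity assumption, and the coefficient `lam` are hypothetical.

```python
import numpy as np

def fiedler_value(weights):
    """Fiedler value (second-smallest Laplacian eigenvalue) of the
    undirected graph induced by a feedforward network.

    `weights` is a list of 2-D arrays; weights[k][i, j] connects unit i
    of layer k to unit j of layer k + 1. Edge weights are |w|.
    """
    sizes = [weights[0].shape[0]] + [W.shape[1] for W in weights]
    n = sum(sizes)
    offsets = np.cumsum([0] + sizes[:-1])
    A = np.zeros((n, n))
    for k, W in enumerate(weights):
        r, c = offsets[k], offsets[k + 1]
        A[r:r + W.shape[0], c:c + W.shape[1]] = np.abs(W)
    A = A + A.T                      # symmetrize: undirected graph
    L = np.diag(A.sum(axis=1)) - A   # graph Laplacian L = D - A
    eigvals = np.linalg.eigvalsh(L)  # eigenvalues in ascending order
    return eigvals[1]                # second-smallest eigenvalue

# Hypothetical 4-8-3 network; the regularized objective would be
# data_loss + lam * fiedler_value(weights).
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 3))]
lam = 0.1                            # hypothetical penalty coefficient
print(lam * fiedler_value(weights))
```

For training, the paper's variational approach avoids repeating this full eigendecomposition at every step; the exact computation above is shown only to make the penalty concrete.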