Learning models that are monotonic with respect to a subset of the inputs is a desirable property for addressing fairness, interpretability, and generalization issues in practice. Existing methods for learning monotonic neural networks either require specially designed model structures to ensure monotonicity, which can be overly restrictive or complicated, or enforce monotonicity by adjusting the learning process, which cannot provably guarantee that the learned model is monotonic on the selected features. In this work, we propose to certify the monotonicity of general piecewise-linear neural networks by solving a mixed integer linear programming problem. This provides a new general approach for learning monotonic neural networks with arbitrary model structures. Our method allows us to train neural networks with a heuristic monotonicity regularization, gradually increasing the regularization magnitude until the learned network is certified monotonic. Compared to prior works, our approach does not require human-designed constraints on the weight space and also yields a more accurate approximation. Empirical studies on various datasets demonstrate the efficiency of our approach over state-of-the-art methods such as Deep Lattice Networks.
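To make the train-then-certify procedure concrete, here is a minimal sketch (not the authors' code) of the loop the abstract describes: train with a heuristic monotonicity penalty on the selected features, attempt certification, and increase the penalty weight until the check passes. The MILP certificate itself is left as a hedged placeholder, and the names `monotone_idx`, `penalty_weight`, and `certify_monotonic` are illustrative assumptions rather than terms from the paper.

```python
import torch
import torch.nn as nn

def monotonicity_penalty(model, x, monotone_idx):
    """Heuristic penalty: mean squared negative partial derivative of the
    output w.r.t. the selected features, estimated at the sampled points x."""
    x = x.clone().requires_grad_(True)
    y = model(x).sum()
    grads = torch.autograd.grad(y, x, create_graph=True)[0]
    violations = torch.relu(-grads[:, monotone_idx])  # only negative slopes
    return (violations ** 2).mean()

def certify_monotonic(model, monotone_idx):
    # Placeholder for the paper's certificate: for a piecewise-linear (ReLU)
    # network, encode the partial derivatives as a mixed integer linear
    # program and verify they are nonnegative over the whole input domain.
    raise NotImplementedError

def train_until_certified(model, loader, monotone_idx,
                          penalty_weight=1.0, growth=2.0, max_rounds=10):
    loss_fn = nn.MSELoss()
    for _ in range(max_rounds):
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss = loss + penalty_weight * monotonicity_penalty(
                model, x, monotone_idx)
            loss.backward()
            opt.step()
        if certify_monotonic(model, monotone_idx):
            return model
        penalty_weight *= growth  # strengthen the regularization and retrain
    raise RuntimeError("network could not be certified within max_rounds")
```

Note the design choice this sketch reflects: the penalty only discourages monotonicity violations at sampled points, so it gives no guarantee on its own; the separate MILP certificate is what turns the heuristic into a provable property.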