Shape constraints, such as non-negativity, monotonicity, convexity or supermodularity, play a key role in various applications of machine learning and statistics. However, incorporating this side information into predictive models as hard constraints (for example, at every point of an interval) for rich function classes is a notoriously challenging problem. We propose a unified and modular convex optimization framework, relying on second-order cone (SOC) tightening, to encode hard affine SDP constraints on function derivatives for models belonging to vector-valued reproducing kernel Hilbert spaces (vRKHSs). The modular nature of the proposed approach allows one to handle multiple shape constraints simultaneously, and to tighten an infinite number of constraints into finitely many. We prove the convergence of the proposed scheme and of its adaptive variant, leveraging geometric properties of vRKHSs. Due to the covering-based construction of the tightening, the method is particularly well-suited to tasks with small to moderate input dimensions. The efficiency of the approach is illustrated in the context of shape optimization, safety-critical control, robotics and econometrics.
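The covering-based tightening mentioned above can be illustrated with a minimal numerical sketch. The idea (simplified here, with illustrative constants not taken from the paper) is that an infinite constraint such as f'(x) ≥ 0 on an interval can be certified by finitely many tightened constraints f'(xᵢ) ≥ η at the centers of a δ-covering, where the buffer η compensates for how much f' can vary within a covering ball; we assume f' is L-Lipschitz and set η = L·δ:

```python
import numpy as np

# Hedged sketch: covering-based tightening of an infinite constraint.
# We certify f'(x) >= 0 on [0, 1] by checking f'(x_i) >= eta at finitely
# many covering centers x_i. The buffer eta = L * delta absorbs how much
# an L-Lipschitz f' can drop between x_i and any point of its covering
# ball. All names and constants here are illustrative assumptions.

L = 4.0                                      # assumed Lipschitz constant of f'
delta = 0.01                                 # covering radius
centers = np.arange(delta, 1.0, 2 * delta)  # delta-covering of [0, 1]
eta = L * delta                              # tightening buffer

def f_prime(x):
    # Example derivative; its slope 2*sin(8x) is bounded by 2 <= L on [0, 1].
    return 0.2 + 0.5 * np.sin(4 * x) ** 2

# Finitely many tightened constraints at the covering centers...
assert np.all(f_prime(centers) >= eta)

# ...imply the original infinite constraint: for any x in [0, 1], the
# nearest center x_i satisfies |x - x_i| <= delta, hence
# f'(x) >= f'(x_i) - L * |x - x_i| >= eta - L * delta = 0.
grid = np.linspace(0.0, 1.0, 10_001)
assert np.all(f_prime(grid) >= 0.0)
print("tightened finite constraints certify f' >= 0 on [0, 1]")
```

In the paper's setting, the buffer instead uses the RKHS norm of the model and the kernel's modulus of continuity, which yields SOC constraints rather than a fixed scalar margin; the geometric mechanism (covering plus compensation) is the same.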