Shape constraints such as positive semi-definiteness (PSD) for matrices or convexity for functions play a central role in many applications in machine learning and the sciences, including metric learning, optimal transport, and economics. Yet very few function models enforce PSD-ness or convexity while offering both good empirical performance and theoretical guarantees. In this paper, we introduce a kernel sum-of-squares model for functions that take values in the PSD cone, extending the kernel sum-of-squares models recently proposed to encode non-negative scalar functions. We provide a representer theorem for this class of PSD functions, show that it constitutes a universal approximator of PSD functions, and derive eigenvalue bounds in the case of subsampled equality constraints. We then apply our results to modeling convex functions by enforcing a kernel sum-of-squares representation of their Hessian, and show that any smooth and strongly convex function may be represented in this way. Finally, we illustrate our methods on a PSD matrix-valued regression task and on scalar-valued convex regression.
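To make the model concrete, the following is a minimal NumPy sketch of one way to parameterize a PSD-valued kernel sum-of-squares function: given anchor points x_1, ..., x_n, a kernel k, and a coefficient matrix A = B^T B (PSD by construction), the map F(x) = Phi(x)^T A Phi(x) with Phi(x) = k_x (kron) I_d returns a d x d PSD matrix for every input. The Gaussian kernel, the choice of anchor points, and the explicit factorization A = B^T B are illustrative assumptions for this sketch, not necessarily the exact construction used in the paper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel; any PSD kernel would do here (illustrative choice).
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

def psd_valued_sos(x, anchors, B, d, sigma=1.0):
    """Evaluate F(x) = Phi(x)^T (B^T B) Phi(x), a d x d PSD matrix.

    anchors: (n, p) array of anchor points x_1, ..., x_n (hypothetical setup)
    B: (m, n*d) parameter matrix, so that A = B^T B is PSD by construction
    """
    n = anchors.shape[0]
    k_x = np.array([gaussian_kernel(x, a, sigma) for a in anchors])  # shape (n,)
    Phi = np.kron(k_x.reshape(n, 1), np.eye(d))                      # shape (n*d, d)
    M = B @ Phi                                                      # shape (m, d)
    return M.T @ M  # PSD for any B, since v^T F(x) v = ||M v||^2 >= 0

# Usage: a random model evaluated at a random point yields non-negative eigenvalues.
rng = np.random.default_rng(0)
n, p, d, m = 5, 3, 2, 4
anchors = rng.normal(size=(n, p))
B = rng.normal(size=(m, n * d))
F = psd_valued_sos(rng.normal(size=p), anchors, B, d)
print(np.linalg.eigvalsh(F))  # all entries >= 0 up to rounding
```

Fitting such a model then amounts to optimizing B (or A directly, under a PSD constraint) against data; the representer theorem mentioned above is what justifies restricting the expansion to finitely many anchor points.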