We prove several representer theorems for a localised version of the regularised and multiview support vector machine learning problem introduced by H.Q.~Minh, L.~Bazzani, and V.~Murino, \textit{Journal of Machine Learning Research}, \textbf{17} (2016) 1--72, which involves operator-valued positive semidefinite kernels and their reproducing kernel Hilbert spaces. The results cover general cases in which convex or nonconvex loss functions and finite- or infinite-dimensional input spaces are considered. We show that the general framework accommodates infinite-dimensional input spaces and nonconvex loss functions in certain special cases, in particular when the loss functions are G\^ateaux differentiable. Detailed calculations are provided for the exponential least squares loss function, which leads to partially nonlinear problems.
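For orientation, the classical representer theorem in an operator-valued reproducing kernel Hilbert space can be stated as follows; this is the generic textbook form, not the localised multiview variant studied in the paper:

```latex
% Generic representer theorem in an operator-valued RKHS (illustrative).
% K : X \times X \to L(\mathcal{Y}) is an operator-valued positive
% semidefinite kernel and \mathcal{H}_K its reproducing kernel Hilbert
% space of \mathcal{Y}-valued functions. Given data
% (x_1, y_1), \ldots, (x_n, y_n) and \lambda > 0, any minimiser of
\min_{f \in \mathcal{H}_K} \; \sum_{i=1}^{n} V\bigl(y_i, f(x_i)\bigr)
    + \lambda \, \|f\|_{\mathcal{H}_K}^{2}
% admits the finite representation
f^{*}(\cdot) \;=\; \sum_{i=1}^{n} K(\cdot, x_i)\, c_i,
    \qquad c_i \in \mathcal{Y},
% so the infinite-dimensional optimisation reduces to determining the
% coefficient vectors c_1, \ldots, c_n.
```

The representer theorems proved in the paper extend this type of finite representation to the localised multiview setting under the stated conditions on the loss.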