This paper generalizes regularized regression problems to a hyper-reproducing kernel Hilbert space (hyper-RKHS), illustrates its utility for kernel learning and out-of-sample extensions, and proves asymptotic convergence results for the introduced regression models from an approximation-theoretic viewpoint. Algorithmically, we consider two regularized regression models with bivariate forms in this space, namely kernel ridge regression (KRR) and support vector regression (SVR) endowed with the hyper-RKHS, and further combine divide-and-conquer with Nystr\"{o}m approximation for scalability to large-sample settings. This framework is general: the underlying kernel is learned from a broad class and may or may not be positive definite, which adapts to various requirements in kernel learning. Theoretically, we study the convergence behavior of the regularized regression algorithms in hyper-RKHS and derive their learning rates, which goes beyond the classical analysis in RKHS due to the non-trivial dependence among pairwise samples and the characterisation of hyper-RKHS. Experimentally, results on several benchmarks suggest that the employed framework is able to learn a general kernel function from an arbitrary similarity matrix, and thus achieves satisfactory performance on classification tasks.
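To make the bivariate regression view concrete, the following minimal sketch (not the authors' implementation) performs KRR in a hyper-RKHS: the training samples are pairs (x_i, x_j) with targets taken from a given similarity matrix, and the learned object is a kernel function on new pairs. The Gaussian hyper-kernel on concatenated pairs, the regularization setting, and all function names are illustrative assumptions; the Nystr\"{o}m and divide-and-conquer acceleration mentioned above is omitted here.

```python
import numpy as np

def hyper_kernel(P, Q, gamma=1.0):
    """Illustrative Gaussian hyper-kernel between pair-samples P (m x 2d) and Q (k x 2d)."""
    sq = np.sum(P**2, axis=1)[:, None] + np.sum(Q**2, axis=1)[None, :] - 2 * P @ Q.T
    return np.exp(-gamma * sq)

def fit_hyper_krr(X, S, lam=1e-2, gamma=1.0):
    """Learn a kernel function from a similarity matrix S by KRR in a hyper-RKHS (sketch)."""
    n, _ = X.shape
    # Build all n^2 pair-samples z = (x_i, x_j) and their targets S[i, j].
    Z = np.hstack([np.repeat(X, n, axis=0), np.tile(X, (n, 1))])     # (n^2, 2d)
    y = S.reshape(-1)                                                 # (n^2,)
    G = hyper_kernel(Z, Z, gamma)                                     # hyper-Gram matrix over pairs
    alpha = np.linalg.solve(G + lam * len(y) * np.eye(len(y)), y)     # ridge solution

    def learned_kernel(x, x_prime):
        z = np.hstack([x, x_prime])[None, :]
        return (hyper_kernel(z, Z, gamma) @ alpha).item()
    return learned_kernel

# Toy usage: recover a kernel from an arbitrary (possibly indefinite) similarity matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
S = np.tanh(X @ X.T)            # similarity matrix, need not be positive definite
k_hat = fit_hyper_krr(X, S)
print(k_hat(X[0], X[1]), S[0, 1])
```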