In this paper, we study the asymptotic properties of regularized least squares with indefinite kernels in reproducing kernel Krein spaces (RKKS). By introducing a bounded hyper-sphere constraint into this non-convex regularized risk minimization problem, we theoretically demonstrate that the problem admits a globally optimal solution in closed form on the sphere, which makes approximation analysis feasible in RKKS. To handle the original regularizer induced by the indefinite inner product, we modify traditional error decomposition techniques, prove convergence results for the introduced hypothesis error based on matrix perturbation theory, and derive learning rates for this regularized regression problem in RKKS. Under certain conditions, the derived learning rates in RKKS coincide with those in reproducing kernel Hilbert spaces (RKHS); this is the first work on the approximation analysis of regularized learning algorithms in RKKS.
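To fix notation, the following display is a minimal sketch of the constrained problem referred to above, written in illustrative notation rather than the paper's exact formulation: given samples $(x_i, y_i)_{i=1}^{n}$, an RKKS $\mathcal{H}_{\mathcal{K}}$ with indefinite inner product $\langle\cdot,\cdot\rangle_{\mathcal{K}}$, and a norm $\|\cdot\|$ from an associated Hilbert-space decomposition, one considers
\[
  \min_{f \in \mathcal{H}_{\mathcal{K}}}\;
  \frac{1}{n}\sum_{i=1}^{n}\bigl(f(x_i) - y_i\bigr)^{2}
  \;+\; \lambda\,\langle f, f\rangle_{\mathcal{K}}
  \qquad \text{subject to}\quad \|f\|^{2} = r^{2},
\]
where the regularization parameter $\lambda$ and the sphere radius $r$ are assumed here for illustration; the indefinite regularizer $\langle f, f\rangle_{\mathcal{K}}$ is what renders the unconstrained problem non-convex, and the hyper-sphere constraint is what makes a closed-form globally optimal solution available on the sphere.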