In this paper, we study regression problems over a separable Hilbert space with the square loss, covering non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral/regularized algorithms, including ridge regression, principal component regression, and gradient methods. We prove optimal, high-probability convergence results in terms of a family of norms for the studied algorithms, under a capacity assumption on the hypothesis space and a general source condition on the target function. As a consequence, we obtain almost-sure convergence results with optimal rates. Our results improve on and generalize previous results, filling a theoretical gap for the non-attainable cases.
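As a minimal illustration of the algorithm class discussed above (not taken from the paper), the sketch below expresses ridge regression, principal component regression, and gradient descent as spectral filters applied to the eigenvalues of the empirical kernel matrix; the function names, the Gaussian kernel, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Sketch of spectral regularization for kernel least-squares (illustrative).
# A spectral algorithm computes coefficients c = g_lam(K/n) y / n, where K is
# the kernel Gram matrix and g_lam is a filter approximating t -> 1/t.
# Different filters recover ridge regression, principal component regression
# (truncated eigen-decomposition), and gradient descent (Landweber iteration).

def spectral_fit(K, y, filter_fn):
    """Fit kernel coefficients by applying a spectral filter to K/n."""
    n = K.shape[0]
    evals, evecs = np.linalg.eigh(K / n)              # eigen-decomposition of K/n
    filtered = filter_fn(np.clip(evals, 0.0, None))   # g_lam on the spectrum
    return evecs @ (filtered * (evecs.T @ y)) / n     # c = V g_lam(S) V^T y / n

# Ridge regression: g_lam(t) = 1 / (t + lam)
ridge = lambda lam: (lambda t: 1.0 / (t + lam))

# Principal component regression: g_lam(t) = 1/t if t >= lam, else 0
pcr = lambda lam: (lambda t: np.where(t >= lam, 1.0 / np.maximum(t, lam), 0.0))

# Gradient descent (Landweber): g_m(t) = eta * sum_{k<m} (1 - eta*t)^k ~ 1/t
def landweber(m, eta=0.5):
    def g(t):
        out = np.zeros_like(t)
        for k in range(m):
            out += eta * (1.0 - eta * t) ** k
        return out
    return g

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(50, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
    K = np.exp(-((X - X.T) ** 2) / 0.5)               # Gaussian kernel Gram matrix
    for name, flt in [("ridge", ridge(1e-2)),
                      ("pcr", pcr(1e-2)),
                      ("gradient", landweber(100))]:
        c = spectral_fit(K, y, flt)
        print(name, "training MSE:", np.mean((K @ c - y) ** 2))
```

In each case the regularization parameter (lam, or the iteration count m for gradient methods) trades bias against variance; the paper's rates describe how to choose it as a function of the sample size under the stated capacity and source conditions.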