In this work we investigate a variant of the online kernelized ridge regression algorithm in the setting of $d$-dimensional adversarial nonparametric regression. We derive regret upper bounds over the Sobolev classes $W_{p}^{\beta}(\mathcal{X})$, $p\geq 2$, $\beta>\frac{d}{p}$. These upper bounds are complemented by a minimax regret analysis, which reveals that for $\beta>\frac{d}{2}$ or $p=\infty$ the rates are (essentially) optimal. Finally, we compare the kernelized ridge regression forecaster with known nonparametric forecasters in terms of regret rates and computational complexity, as well as with the excess risk rates achievable in the setting of statistical (i.i.d.) nonparametric regression.