We propose and study kernel conjugate gradient methods (KCGM) with random projections for least-squares regression over a separable Hilbert space. Considering two types of random projections, generated by randomized sketches and by Nystr\"{o}m subsampling, we prove optimal statistical results, with respect to a family of norms, for the algorithms under a suitable stopping rule. In particular, our results show that if the projection dimension is proportional to the effective dimension of the problem, KCGM with randomized sketches generalizes optimally while achieving a computational advantage. As a corollary, we derive optimal rates for classic KCGM in well-conditioned regimes for the case where the target function may not lie in the hypothesis space.
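To fix ideas, here is a minimal finite-sample sketch of the projected problem (the notation is ours and purely illustrative; the paper's setting is a general separable Hilbert space): given a kernel matrix $K \in \mathbb{R}^{n \times n}$, labels $\mathbf{y} \in \mathbb{R}^{n}$, and a projection matrix $S \in \mathbb{R}^{m \times n}$ (e.g., Gaussian or subsampled randomized Hadamard rows for randomized sketches, or subsampled rows of the identity for Nystr\"{o}m subsampling), one may restrict the representer coefficients to $\alpha = S^{\top}\beta$ and apply conjugate gradient to the projected normal equation
\[
S K^{2} S^{\top} \beta \;=\; S K \mathbf{y}, \qquad \beta \in \mathbb{R}^{m},
\]
stopping after $t$ iterations chosen by the stopping rule, so that early stopping plays the role of regularization. After the projected system is formed, each CG iteration works with $m \times m$ quantities rather than $n \times n$ ones, which is the source of the computational advantage when $m$ is of the order of the effective dimension.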