We analyze the Nystr\"om approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nystr\"om approximation with i.i.d. sampling and singular value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We further introduce a refined selection of subspaces in the Nystr\"om approximation, with theoretical guarantees, that is applicable to non-i.i.d. landmark points. Finally, we discuss the application of these methods to convex kernel quadrature and give novel theoretical guarantees as well as numerical observations.
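As a concrete illustration (not part of the paper's method), the conventional Nystr\"om approximation with i.i.d. landmarks and SVD-based truncation can be sketched as follows. The Gaussian kernel, the sampling distribution, the sample sizes, and the truncation rank below are all arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Draw n points i.i.d. from the underlying measure
# (here, a 2D standard normal, chosen only for the demo).
n, m, r = 200, 20, 10
X = rng.standard_normal((n, 2))

# i.i.d. landmark points: a uniform subsample of size m.
idx = rng.choice(n, size=m, replace=False)
Z = X[idx]

K = gaussian_kernel(X, X)        # full kernel matrix (for comparison only)
K_nm = gaussian_kernel(X, Z)     # cross-kernel between data and landmarks
K_mm = gaussian_kernel(Z, Z)     # landmark kernel matrix

# Rank-r truncated pseudo-inverse of K_mm via SVD.
U, s, Vt = np.linalg.svd(K_mm)
K_mm_pinv = (Vt[:r].T / s[:r]) @ U[:, :r].T

# Nystrom approximation: K ~ K_nm K_mm^+ K_nm^T.
K_hat = K_nm @ K_mm_pinv @ K_nm.T

# Relative error in the spectral norm.
err = np.linalg.norm(K - K_hat, 2) / np.linalg.norm(K, 2)
```

The low-rank factor `K_nm @ K_mm_pinv` never requires forming the full $n \times n$ matrix `K`, which is the practical point of the method; it is computed here only to measure the approximation error.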