Kernel ridge regression is well known to achieve minimax optimal rates in low-dimensional settings. However, its behavior in high dimensions is much less understood. Recent work establishes consistency for kernel regression under certain assumptions on the ground-truth function and the distribution of the input data. In this paper, we show that the rotational invariance of commonly studied kernels (such as the RBF kernel, inner-product kernels, and the fully-connected NTK of any depth) induces a bias towards low-degree polynomials in high dimensions. Our result implies a lower bound on the generalization error that holds for a wide range of input distributions and various scaling choices, for kernels with different eigenvalue decays. This lower bound suggests that general consistency results for kernel ridge regression in high dimensions require a more refined analysis that depends on the structure of the kernel beyond its eigenvalue decay.
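As a minimal numerical sketch of the low-degree bias described above (not taken from the paper; the dimension, sample size, ridge parameter, and bandwidth scaling `gamma = 1/d` are illustrative assumptions), one can fit kernel ridge regression with an RBF kernel to a degree-1 and a degree-3 polynomial target on high-dimensional Gaussian inputs and compare the held-out fit: with n of the same order as d, the low-degree target is learned reasonably well while the degree-3 target is essentially not learned at all.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
d, n_train, n_test = 200, 2000, 2000

# High-dimensional inputs with i.i.d. standard Gaussian coordinates.
X_train = rng.standard_normal((n_train, d))
X_test = rng.standard_normal((n_test, d))

# Two polynomial targets of unit variance: degree 1 and degree 3.
targets = {
    "degree-1": lambda X: X[:, 0],
    "degree-3": lambda X: X[:, 0] * X[:, 1] * X[:, 2],
}

for name, f in targets.items():
    y_train, y_test = f(X_train), f(X_test)
    # RBF kernel with the usual high-dimensional bandwidth scaling gamma ~ 1/d
    # (an assumption of this sketch), plus a small ridge penalty.
    model = KernelRidge(kernel="rbf", gamma=1.0 / d, alpha=1e-3)
    model.fit(X_train, y_train)
    print(f"{name} target: test R^2 = {model.score(X_test, y_test):.3f}")
```

Under these assumptions the degree-1 target attains a clearly positive test R^2, whereas the degree-3 target stays near zero, consistent with the claim that rotationally invariant kernels in high dimensions behave like low-degree polynomial predictors unless the sample size grows much faster than the dimension.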