We study the generalization error of functions that interpolate prescribed data points and are selected by minimizing a weighted norm. Under natural and general conditions, we prove that both the interpolants and their generalization errors converge as the number of parameters grows, and that the limiting interpolant belongs to a reproducing kernel Hilbert space. This rigorously establishes an implicit bias of minimum weighted norm interpolation and explains why norm minimization may either benefit or suffer from over-parameterization. As special cases of this theory, we study interpolation by trigonometric polynomials and spherical harmonics. Our approach is deterministic and approximation-theoretic, as opposed to statistical or random-matrix based.
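The following is a minimal sketch of minimum weighted norm interpolation in the trigonometric-polynomial case discussed above. It assumes a weighted l2 norm on the coefficients of the real trigonometric basis; the specific weight sequence, function names, and toy target here are hypothetical illustrations, not the paper's construction. The closed-form solution c = W^{-1} Phi^T (Phi W^{-1} Phi^T)^{-1} y is the standard minimum-norm solution of the interpolation constraints when the design matrix has full row rank.

```python
import numpy as np

def trig_features(x, p):
    """Real trigonometric basis: 1, cos(kx), sin(kx) for k = 1..p."""
    cols = [np.ones_like(x)]
    for k in range(1, p + 1):
        cols.append(np.cos(k * x))
        cols.append(np.sin(k * x))
    return np.stack(cols, axis=1)                 # shape (n, 2p + 1)

def min_weighted_norm_interpolant(x, y, p, weights):
    """Coefficients c minimizing sum_j weights[j] * c[j]**2
    subject to exact interpolation Phi @ c = y (over-parameterized: 2p + 1 >= n)."""
    Phi = trig_features(x, p)                     # design matrix, n x (2p + 1)
    W_inv = np.diag(1.0 / weights)
    G = Phi @ W_inv @ Phi.T                       # n x n Gram matrix of the induced kernel
    alpha = np.linalg.solve(G, y)                 # dual coefficients
    return W_inv @ Phi.T @ alpha                  # primal coefficients c

# Toy usage: interpolate noiseless samples of a smooth target.
rng = np.random.default_rng(0)
n, p = 10, 30                                     # 2p + 1 = 61 parameters >> n = 10 data points
x = np.sort(rng.uniform(0, 2 * np.pi, n))
target = lambda t: np.sin(t) + 0.3 * np.cos(3 * t)
y = target(x)

# Hypothetical weights penalizing high frequencies (degree of each basis function).
degrees = np.concatenate([[0], np.repeat(np.arange(1, p + 1), 2)])
weights = (1.0 + degrees) ** 2

c = min_weighted_norm_interpolant(x, y, p, weights)
x_test = np.linspace(0, 2 * np.pi, 400)
print("max error at data points :", np.max(np.abs(trig_features(x, p) @ c - y)))
print("sup-norm error on a grid :", np.max(np.abs(trig_features(x_test, p) @ c - target(x_test))))
```

Under growing-weight assumptions of this kind, the induced kernel sum_k weights[k]^{-1} phi_k(x) phi_k(x') converges, which is the mechanism behind the limiting reproducing kernel Hilbert space described in the abstract.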