We consider interpolation learning in high-dimensional linear regression with Gaussian data, and prove a generic uniform convergence guarantee on the generalization error of interpolators in an arbitrary hypothesis class in terms of the class's Gaussian width. Applying the generic bound to Euclidean norm balls recovers the consistency result of Bartlett et al. (2020) for minimum-norm interpolators, and confirms a prediction of Zhou et al. (2020) for near-minimal-norm interpolators in the special case of Gaussian data. We demonstrate the generality of the bound by applying it to the simplex, obtaining a novel consistency result for minimum l1-norm interpolators (basis pursuit). Our results show how norm-based generalization bounds can explain and be used to analyze benign overfitting, at least in some settings.
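As a purely illustrative sketch (not from the paper), the two interpolators the abstract discusses can be computed directly in the overparameterized regime d > n: the minimum Euclidean-norm interpolator via the pseudoinverse, and the minimum l1-norm interpolator (basis pursuit) via a standard linear-programming reformulation. The data-generating setup below (sparse ground truth, noise level 0.1) is an assumption chosen only for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 20, 100  # overparameterized: d > n, so exact interpolators exist
X = rng.standard_normal((n, d))        # Gaussian data, as in the abstract
w_star = np.zeros(d)
w_star[0] = 1.0                        # illustrative sparse ground truth
y = X @ w_star + 0.1 * rng.standard_normal(n)

# Minimum Euclidean-norm interpolator: w = X^+ y (pseudoinverse solution)
w_l2 = np.linalg.pinv(X) @ y

# Minimum l1-norm interpolator (basis pursuit) as an LP:
#   minimize sum(u)  subject to  -u <= w <= u,  X w = y,
# with stacked variables z = [w; u].
c = np.concatenate([np.zeros(d), np.ones(d)])
A_eq = np.hstack([X, np.zeros((n, d))])
b_eq = y
A_ub = np.vstack([
    np.hstack([np.eye(d), -np.eye(d)]),    #  w - u <= 0
    np.hstack([-np.eye(d), -np.eye(d)]),   # -w - u <= 0
])
b_ub = np.zeros(2 * d)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * d + [(0, None)] * d)
w_l1 = res.x[:d]

# Both solutions interpolate the training data exactly (up to solver tolerance),
# but minimize different norms -- the objects the abstract's bounds apply to.
print(np.linalg.norm(X @ w_l2 - y), np.linalg.norm(X @ w_l1 - y))
```

The abstract's guarantee bounds the generalization error of such interpolators through the Gaussian width of the norm ball they live in; the sketch only constructs the interpolators themselves.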