We compute precise asymptotic expressions for the learning curves of least squares random feature (RF) models with either a separable strongly convex regularizer or $\ell_1$ regularization. We propose a novel multi-level application of the convex Gaussian min-max theorem (CGMT) to overcome the traditional difficulty of finding computable expressions for random feature models with correlated data. Our result takes the form of a computable four-dimensional scalar optimization. In contrast to previous results, our approach does not require solving an often intractable proximal operator, which scales with the number of model parameters. Furthermore, we extend the universality results for the training and generalization errors of RF models to $\ell_1$ regularization. In particular, we demonstrate that, under mild conditions, random feature models with elastic net or $\ell_1$ regularization are asymptotically equivalent to a surrogate Gaussian model with the same first and second moments. We numerically demonstrate the predictive capacity of our results, and show experimentally that the predicted test error is accurate even in the non-asymptotic regime.
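As a rough illustration of the Gaussian equivalence claimed above, the following sketch compares the test error of an $\ell_1$-regularized tanh random feature model with that of a Gaussian surrogate whose features match the RF features in their first and second moments (jointly with the inputs), via the standard linear-plus-independent-noise equivalent construction. This is a minimal numerical sketch, not the paper's experimental setup: the dimensions, noise level, activation, and regularization strength are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
d, p, n, n_test = 100, 200, 400, 2000          # illustrative sizes (assumption)

W = rng.standard_normal((p, d)) / np.sqrt(d)   # feature weights; rows have norm ~ 1
beta = rng.standard_normal(d) / np.sqrt(d)     # ground-truth signal

# Coefficients of sigma = tanh under a standard Gaussian input:
# mu = E[sigma(G)], nu = E[G sigma(G)], kappa^2 = Var(sigma(G)) - nu^2.
G = rng.standard_normal(10**6)
mu = np.tanh(G).mean()                         # ~0, since tanh is odd
nu = (G * np.tanh(G)).mean()
kappa = np.sqrt(np.tanh(G).var() - nu**2)

def sample(n_s):
    X = rng.standard_normal((n_s, d))
    y = X @ beta + 0.1 * rng.standard_normal(n_s)
    F = np.tanh(X @ W.T)                       # random features sigma(Wx)
    # Gaussian surrogate with the same first and second moments:
    # mu + nu * Wx + independent Gaussian noise of variance kappa^2.
    S = mu + nu * (X @ W.T) + kappa * rng.standard_normal((n_s, p))
    return F, S, y

F_tr, S_tr, y_tr = sample(n)
F_te, S_te, y_te = sample(n_test)

# l1-regularized least squares on both feature maps; the universality claim
# predicts the two test errors agree asymptotically.
for name, (tr, te) in {"RF": (F_tr, F_te), "surrogate": (S_tr, S_te)}.items():
    m = Lasso(alpha=0.01).fit(tr, y_tr)        # alpha chosen for illustration
    print(f"{name}: test MSE = {np.mean((m.predict(te) - y_te) ** 2):.4f}")
```

At these moderate sizes the two printed errors should already be close, consistent with the abstract's observation that the asymptotic predictions remain accurate in the non-asymptotic regime.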