This article studies basis pursuit, i.e., minimum $\ell_1$-norm interpolation, in sparse linear regression with additive errors. No conditions on the errors are imposed. It is assumed that the number of i.i.d. Gaussian features grows superlinearly in the number of samples. The main result is that under these conditions the Euclidean error of recovering the true regressor is of the order of the average noise level. Hence, the regressor recovered by basis pursuit is close to the truth if the average noise level is small. Lower bounds that show near optimality of the results complement the analysis. In addition, these results are extended to low rank trace regression. The proofs rely on new lower tail bounds for maxima of Gaussian vectors and for the spectral norm of Gaussian matrices, respectively, and might be of independent interest as they are significantly stronger than the corresponding upper tail bounds.
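As a minimal illustration of the estimator studied here (not part of the paper), the basis pursuit program $\min \|\beta\|_1$ subject to $X\beta = y$ can be cast as a linear program by splitting $\beta$ into its positive and negative parts. The sketch below, assuming `scipy` and a noiseless sparse model, recovers a 3-sparse regressor from Gaussian features in the overparameterized regime $p \gg n$:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 30, 100  # many more features than samples, as in the abstract's regime
X = rng.standard_normal((n, p))

beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]  # sparse ground truth
y = X @ beta_true  # noiseless for this sketch; the paper allows arbitrary errors

# LP reformulation: beta = u - v with u, v >= 0, minimize 1^T (u + v)
c = np.ones(2 * p)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * p))

beta_hat = res.x[:p] - res.x[p:]
err = np.linalg.norm(beta_hat - beta_true)
```

With zero noise and enough Gaussian samples relative to the sparsity, the Euclidean error `err` is essentially zero, consistent with the abstract's claim that the recovery error scales with the average noise level.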