Major progress has been made in the past decade in characterizing the asymptotic behavior of regularized M-estimators in high-dimensional regression problems, in the proportional asymptotic regime where the sample size $n$ and the number of features $p$ grow simultaneously with $n/p\to \delta \in(0,\infty)$, using powerful tools such as Approximate Message Passing (AMP) or the Convex Gaussian Min-Max Theorem (CGMT). The asymptotic error and behavior of the regularized M-estimator are then typically described by a system of nonlinear equations in a few scalar unknowns, and the solution to this system precisely characterizes the asymptotic error. Applying the CGMT and related machinery requires the existence of a solution to this low-dimensional system of equations. This paper resolves the question of the existence of such a solution for linear models with independent additive noise, when both the data-fitting loss function and the regularization penalty are separable and convex. Such existence results were previously known either under strong convexity assumptions or for specific estimators such as the Lasso. The main idea behind the existence result is inspired by an argument developed in \cite{montanari2019generalization,celentano2020lasso} in different contexts: by constructing an ad-hoc convex minimization problem in an infinite-dimensional Hilbert space, the existence of a Lagrange multiplier for this optimization problem makes it possible to explicitly construct solutions to the low-dimensional system of interest. The conditions under which we derive this existence result correspond exactly to the side of the phase transition where perfect recovery $\hat x= x_0$ fails, so that these conditions are optimal.
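As an illustration of the kind of low-dimensional system in question, the well-known state-evolution and calibration equations for the Lasso from the AMP literature (a sketch of the special case, not the general system studied in this paper) determine two scalars $(\tau,\alpha)$:

\begin{align*}
\tau^2 &= \sigma^2 + \frac{1}{\delta}\,
    \mathbb{E}\Big[\big(\eta(X_0 + \tau Z;\, \alpha\tau) - X_0\big)^2\Big],\\
\lambda &= \alpha\tau\Big(1 - \frac{1}{\delta}\,
    \mathbb{E}\big[\eta'(X_0 + \tau Z;\, \alpha\tau)\big]\Big),
\end{align*}

where $\eta(\cdot;\theta)$ is the soft-thresholding function, $Z\sim N(0,1)$ is independent of the prior $X_0$, $\sigma^2$ is the noise variance, and $\lambda$ is the regularization parameter. The existence question addressed here is whether such systems admit a solution $(\tau,\alpha)$ for general separable convex losses and penalties, not only for this special case.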