Major progress has been made over the past decade in characterizing the asymptotic behavior of regularized M-estimators in high-dimensional regression, in the proportional asymptotic regime where the sample size $n$ and the number of features $p$ grow simultaneously with $n/p \to \delta \in (0,\infty)$, using powerful tools such as Approximate Message Passing and the Convex Gaussian Min-Max Theorem (CGMT). The asymptotic behavior of the regularized M-estimator is then typically described by a system of nonlinear equations in a few scalar unknowns, whose solution precisely characterizes the asymptotic error. Applying the CGMT and related machinery requires the existence and uniqueness of a solution to this low-dimensional system of equations, or to a related scalar convex minimization problem. This paper resolves the question of existence and uniqueness of a solution to this low-dimensional system in the case of linear models with independent additive noise, when both the data-fitting loss function and the regularizer are separable and convex. Such an existence result was previously known only under strong convexity or for specific estimators such as the Lasso. The main idea behind the existence result is inspired by an argument developed by Montanari et al. [2023] and Celentano et al. [2023] in different contexts: by constructing an ad hoc convex minimization problem in an infinite-dimensional Hilbert space, the existence of a Lagrange multiplier for this optimization problem makes it possible to construct explicit solutions to the low-dimensional system of interest. The conditions under which we derive this existence result correspond exactly to the side of the phase transition where perfect recovery $\hat{x} = x_0$ fails, so that these conditions are optimal.
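As an illustration of the kind of low-dimensional system referred to above (not part of the paper's own statement), the best-known instance is the two-scalar fixed-point system for the Lasso derived by Bayati and Montanari [2012]. With soft-thresholding $\eta(x;\theta) = \operatorname{sign}(x)(|x|-\theta)_+$, noise level $\sigma$, signal prior $X_0$, and $Z \sim \mathcal{N}(0,1)$ independent of $X_0$, the unknowns $(\alpha, \tau)$ solve
\begin{align*}
\tau^2 &= \sigma^2 + \frac{1}{\delta}\,\mathbb{E}\bigl[\bigl(\eta(X_0 + \tau Z;\, \alpha\tau) - X_0\bigr)^2\bigr],\\
\lambda &= \alpha\tau\Bigl(1 - \frac{1}{\delta}\,\mathbb{P}\bigl(|X_0 + \tau Z| > \alpha\tau\bigr)\Bigr),
\end{align*}
where $\tau$ is the effective noise level of the estimator and $\lambda$ the Lasso regularization parameter. The present paper concerns the existence and uniqueness of solutions to systems of this type for general separable convex losses and regularizers, not only this Lasso special case.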