In sparse linear regression, the SLOPE estimator generalizes LASSO by penalizing different coordinates of the estimate according to their magnitudes. In this paper, we present a precise performance characterization of SLOPE in the asymptotic regime where the number of unknown parameters grows in proportion to the number of observations. Our asymptotic characterization enables us to derive the fundamental limits of SLOPE in both estimation and variable selection settings. We also provide a computationally feasible way to optimally design the regularizing sequences so that the fundamental limits are reached. In both settings, we show that the optimal design problem can be formulated as certain infinite-dimensional convex optimization problems, which admit efficient and accurate finite-dimensional approximations. Numerical simulations verify all of our asymptotic predictions and demonstrate the superiority of our optimal regularizing sequences over other designs used in the existing literature.
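As a minimal sketch of the penalty structure described above, the snippet below computes the sorted-ℓ1 penalty that defines SLOPE: each sorted magnitude of the estimate is multiplied by the corresponding entry of a non-increasing regularizing sequence. The function name and the example values are illustrative, not from the paper; note that with a constant sequence the penalty reduces to the LASSO penalty.

```python
import numpy as np

def slope_penalty(beta, lam):
    """Sorted-L1 (SLOPE) penalty: sum_i lam[i] * |beta|_(i),
    where |beta|_(1) >= |beta|_(2) >= ... are the magnitudes of beta
    sorted in decreasing order, and lam is a non-increasing sequence."""
    mags = np.sort(np.abs(beta))[::-1]  # magnitudes, largest first
    return float(np.dot(lam, mags))

beta = np.array([0.5, -2.0, 1.0])

# Constant sequence: SLOPE coincides with LASSO (here, 0.3 * ||beta||_1).
lam_lasso = np.array([0.3, 0.3, 0.3])
print(slope_penalty(beta, lam_lasso))  # 0.3 * (2.0 + 1.0 + 0.5) = 1.05

# Decreasing sequence: larger coordinates are penalized more heavily.
lam_slope = np.array([0.5, 0.3, 0.1])
print(slope_penalty(beta, lam_slope))  # 0.5*2.0 + 0.3*1.0 + 0.1*0.5 = 1.35
```

The design problem the paper studies is the choice of the sequence `lam`, which this sketch simply takes as given.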