The high-dimensional linear regression model is among the most popular statistical models for high-dimensional data, yet obtaining a sparse set of regression coefficients remains a challenging task. In this paper, we propose a simple heuristic algorithm for constructing sparse high-dimensional linear regression models, which is adapted from the shortest solution-guided decimation algorithm and is referred to as ASSD. The algorithm builds the support of the regression coefficients under the guidance of the least-squares solution of the recursively decimated linear equations, and it applies an early-stopping criterion and a second-stage thresholding procedure to refine this support. Extensive numerical results demonstrate that ASSD outperforms LASSO, vector approximate message passing, and two other representative greedy algorithms in both solution accuracy and robustness. ASSD is especially suitable for linear regression problems with the highly correlated measurement matrices encountered in real-world applications.
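The support-construction loop described above can be sketched as follows. This is an illustrative reconstruction under our own assumptions, not the paper's exact ASSD implementation: the selection rule (pick the largest-magnitude entry of the least-squares solution), the early-stopping tolerance, and the second-stage threshold value are all placeholders chosen for the sketch.

```python
# Hedged sketch of a shortest-solution-guided decimation loop with early stopping
# and second-stage thresholding. Function name, tolerances, and the exact
# selection/stopping rules are our assumptions, not the paper's specification.
import numpy as np

def assd_sketch(X, y, max_support=None, tol=1e-6, threshold=1e-3):
    n, p = X.shape
    if max_support is None:
        max_support = n // 2
    support = []                      # indices admitted into the support
    active = list(range(p))           # columns not yet decimated
    residual = y.copy()
    prev_norm = np.linalg.norm(residual)
    ws = np.zeros(0)
    for _ in range(max_support):
        # The least-squares (minimum-norm) solution on the remaining columns
        # guides which coefficient to admit next.
        w, *_ = np.linalg.lstsq(X[:, active], residual, rcond=None)
        k = int(np.argmax(np.abs(w)))
        support.append(active.pop(k))
        # Refit on the current support and decimate its contribution.
        ws, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ ws
        norm = np.linalg.norm(residual)
        # Early stopping: halt once the residual stops shrinking appreciably.
        if prev_norm - norm < tol:
            break
        prev_norm = norm
    # Second-stage thresholding: discard coefficients of negligible magnitude.
    w_full = np.zeros(p)
    w_full[support] = ws
    w_full[np.abs(w_full) < threshold] = 0.0
    return w_full
```

On an easy orthogonal design (e.g. `X = np.eye(10)` with a 2-sparse target), the sketch recovers the support exactly; on underdetermined Gaussian designs it behaves like a greedy pursuit and its accuracy depends on the correlation structure of `X`.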