We propose a class of greedy algorithms for weighted sparse recovery by considering new loss function-based generalizations of Orthogonal Matching Pursuit (OMP). Given a (regularized) loss function, the proposed algorithms alternate between iteratively constructing the signal support via greedy index selection and updating the signal by solving a local data-fitting problem restricted to the current support. We show that greedy selection rules associated with popular weighted sparsity-promoting loss functions admit explicitly computable and simple formulas. Specifically, we consider $ \ell^0 $- and $ \ell^1 $-based versions of the weighted LASSO (Least Absolute Shrinkage and Selection Operator), the Square-Root LASSO (SR-LASSO), and the Least Absolute Deviations LASSO (LAD-LASSO). Through numerical experiments on Gaussian compressive sensing and high-dimensional function approximation, we demonstrate the effectiveness of the proposed algorithms and empirically show that they inherit desirable characteristics from the corresponding loss functions, such as SR-LASSO's noise-blind optimal parameter tuning and LAD-LASSO's fault tolerance. In doing so, our study sheds new light on the connection between greedy sparse recovery and convex relaxation.
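To make the alternating structure described above concrete, the following is a minimal Python sketch of the generic OMP-style template: a pluggable greedy selection rule followed by a local data-fitting update restricted to the current support. The names (`omp_template`, `select_index`, `omp_rule`) are illustrative, and the sketch instantiates only the classical OMP rule with a plain least-squares update; the paper's weighted, loss-based selection rules (for the weighted LASSO, SR-LASSO, and LAD-LASSO) are not reproduced here.

```python
import numpy as np

def omp_template(A, b, num_iters, select_index):
    """Generic OMP-style loop: alternate greedy index selection with a
    local data-fitting update restricted to the current support.

    `select_index` is a pluggable greedy selection rule; the weighted,
    loss-based rules from the paper would be substituted here.  This
    sketch uses a plain least-squares update as the local data-fitting
    step (an assumption; other losses yield other local problems).
    """
    m, N = A.shape
    support = []                  # current support set
    x = np.zeros(N)
    for _ in range(num_iters):
        r = b - A @ x             # current residual
        j = select_index(A, r, support)
        if j not in support:
            support.append(j)
        # Local data-fitting problem on the current support columns.
        AS = A[:, support]
        xS, *_ = np.linalg.lstsq(AS, b, rcond=None)
        x = np.zeros(N)
        x[support] = xS
    return x, support

def omp_rule(A, r, support):
    """Classical OMP selection: index maximizing |<a_j, residual>|."""
    return int(np.argmax(np.abs(A.T @ r)))
```

A usage example under the same assumptions: for a Gaussian matrix `A = np.random.randn(m, N) / np.sqrt(m)` and measurements `b = A @ x0` of an s-sparse `x0`, calling `omp_template(A, b, s, omp_rule)` runs s iterations of standard OMP.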