In this paper, we discuss the application of iterative Stochastic Optimization routines to the problem of sparse signal recovery from noisy observations. Using the Stochastic Mirror Descent algorithm as a building block, we develop a multistage procedure for the recovery of sparse solutions to Stochastic Optimization problems under assumptions of smoothness and quadratic minoration of the expected objective. An interesting feature of the proposed algorithm is the linear convergence of the approximate solution during the preliminary phase of the routine, when the component of the stochastic error in the gradient observation due to a bad initial approximation of the optimal solution is larger than the "ideal" asymptotic error component owing to the observation noise "at the optimal solution." We also show how one can straightforwardly enhance the reliability of the corresponding solution by using Median-of-Means-like techniques. We illustrate the performance of the proposed algorithms in application to the classical problems of recovery of sparse and low-rank signals in the linear regression framework. We show how, under rather weak assumptions on the regressor and noise distributions, they lead to parameter estimates which obey (up to factors which are logarithmic in the problem dimension and confidence level) the best accuracy bounds known to us.
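To make the ideas above concrete, the following is a minimal sketch (not the paper's exact procedure) of a multistage Stochastic Mirror Descent method for sparse linear regression: each stage runs SMD with the standard l1-adapted mirror map and restarts from the averaged iterate with a halved step size, and a Median-of-Means-style aggregation over independent replicas is added at the end. The problem instance, step sizes, stage counts, and the coordinate-wise-median aggregator are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse linear-regression instance (illustrative only):
# observe y = phi^T x_star + sigma * xi, with s-sparse x_star in R^d.
d, s, sigma = 200, 5, 0.1
x_star = np.zeros(d)
x_star[:s] = 1.0

def stoch_grad(x):
    """Unbiased stochastic gradient of g(x) = E[(phi^T x - y)^2] / 2."""
    phi = rng.standard_normal(d)
    y = phi @ x_star + sigma * rng.standard_normal()
    return (phi @ x - y) * phi

# l1-adapted mirror map psi(x) = ||x||_p^2 / (2(p-1)) with p = 1 + 1/ln d,
# a standard choice for sparse recovery in l1 geometry.
p = 1.0 + 1.0 / np.log(d)
q = p / (p - 1.0)  # conjugate exponent

def grad_psi(x):
    """Gradient of the mirror map psi."""
    nx = np.linalg.norm(x, p)
    if nx == 0.0:
        return np.zeros_like(x)
    return np.sign(x) * np.abs(x) ** (p - 1) * nx ** (2 - p) / (p - 1)

def grad_psi_conj(z):
    """Gradient of the conjugate psi*; inverse map of grad_psi."""
    nz = np.linalg.norm(z, q)
    if nz == 0.0:
        return np.zeros_like(z)
    return (p - 1) * np.sign(z) * np.abs(z) ** (q - 1) * nz ** (2 - q)

def smd_stage(x0, step, n_iters):
    """One SMD stage: dual averaging of gradient steps; return averaged iterate."""
    x, acc = x0.copy(), np.zeros_like(x0)
    for _ in range(n_iters):
        x = grad_psi_conj(grad_psi(x) - step * stoch_grad(x))
        acc += x
    return acc / n_iters

def multistage_smd(x0, n_stages=6, n_iters=400, step0=0.05):
    """Restart each stage from the previous average with a halved step size,
    mimicking the fast preliminary phase followed by an asymptotic phase."""
    x = x0
    for k in range(n_stages):
        x = smd_stage(x, step0 / 2**k, n_iters)
    return x

# Median-of-Means-style reliability boost (sketch): aggregate independent
# replicas by a coordinate-wise median instead of relying on a single run.
x_hat = np.median([multistage_smd(np.zeros(d)) for _ in range(5)], axis=0)
```

A single replica already recovers the support approximately; the median over replicas trades extra computation for a confidence-level improvement, in the spirit of the Median-of-Means construction mentioned in the abstract.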