This paper investigates total variation minimization in one spatial dimension for the recovery of gradient-sparse signals from undersampled Gaussian measurements. Recently established bounds for the required sampling rate state that uniform recovery of all $s$-gradient-sparse signals in $\mathbb{R}^n$ is only possible with $m \gtrsim \sqrt{s n} \cdot \text{PolyLog}(n)$ measurements. Such a condition is especially prohibitive for high-dimensional problems, where $s$ is much smaller than $n$. However, previous empirical findings seem to indicate that this sampling rate does not reflect the typical behavior of total variation minimization. The present work provides a rigorous analysis that breaks the $\sqrt{s n}$-bottleneck for a large class of "natural" signals. The main result shows that non-uniform recovery succeeds with high probability for $m \gtrsim s \cdot \text{PolyLog}(n)$ measurements if the jump discontinuities of the signal vector are sufficiently well separated. In particular, this guarantee allows for signals arising from a discretization of piecewise constant functions defined on an interval. The key ingredient of the proof is a novel upper bound for the associated conic Gaussian mean width, which is based on a signal-dependent, non-dyadic Haar wavelet transform. Furthermore, a natural extension to stable and robust recovery is addressed.
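To make the setting concrete, the sketch below builds a gradient-sparse signal with well-separated jumps, takes undersampled Gaussian measurements, and runs a generic projected subgradient method for the TV minimization problem $\min \|Dx\|_1$ subject to $Ax = b$. All sizes and jump positions are illustrative choices, and the solver is a standard off-the-shelf scheme, not the paper's proof technique or any specific algorithm it proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative instance: n, s, and the jump positions are hypothetical.
n = 64
s = 3  # gradient sparsity: number of jump discontinuities
x_true = np.zeros(n)
for pos, height in [(10, 1.0), (30, -2.0), (50, 1.5)]:  # well-separated jumps
    x_true[pos:] += height

# Undersampled Gaussian measurements, m ~ s * PolyLog(n) in spirit.
m = 24
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Discrete gradient (finite difference) operator D: rows are e_{i+1} - e_i.
D = np.diff(np.eye(n), axis=0)

def tv(x):
    """Total variation ||Dx||_1 of a 1-D signal."""
    return np.abs(D @ x).sum()

# Orthogonal-style projection onto the affine constraint set {x : Ax = b}.
pinv = A.T @ np.linalg.inv(A @ A.T)
def project(x):
    return x + pinv @ (b - A @ x)

# Projected subgradient method with diminishing steps; track the best
# (lowest-TV) feasible iterate seen so far.
x = project(np.zeros(n))
best = x.copy()
for k in range(1, 2001):
    g = D.T @ np.sign(D @ x)          # subgradient of ||Dx||_1
    x = project(x - (0.5 / np.sqrt(k)) * g)
    if tv(x) < tv(best):
        best = x.copy()
```

Every iterate is feasible by construction, so `A @ best` matches `b` exactly (up to floating point), and the TV of the best iterate can only improve on the initial least-norm feasible point.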