It is known that the minimum mean-squared error (MMSE) denoiser under Gaussian noise can be written as a proximal operator, which suffices for asymptotic convergence of plug-and-play (PnP) methods but does not reveal the structure of the induced regularizer or give convergence rates. We show that the MMSE denoiser corresponds to a regularizer that can be written explicitly as an upper Moreau envelope of the negative log-marginal density, which in turn implies that the regularizer is 1-weakly convex. Using this property, we derive (to the best of our knowledge) the first sublinear convergence guarantee for PnP proximal gradient descent with an MMSE denoiser. We validate the theory with a one-dimensional synthetic study that recovers the implicit regularizer, and with imaging experiments (deblurring and computed tomography) that exhibit the predicted sublinear behavior.
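As a concrete illustration of the setting described above (a sketch, not the paper's implementation), consider a one-dimensional Gaussian-mixture prior, for which the MMSE denoiser E[x | y] has a closed form via the posterior component weights. PnP proximal gradient descent then replaces the proximal step with this denoiser. All parameter values below (mixture weights, noise level, data-fit term) are illustrative assumptions:

```python
import numpy as np

# Hypothetical 1D setup: Gaussian-mixture prior p(x) = sum_k w_k N(mu_k, s2_k),
# observed through y = x + N(0, sigma2). The MMSE denoiser is then explicit.
w  = np.array([0.5, 0.5])   # mixture weights (assumed)
mu = np.array([-2.0, 2.0])  # component means (assumed)
s2 = np.array([0.3, 0.3])   # component variances (assumed)
sigma2 = 0.5                # denoiser noise level sigma^2 (assumed)

def mmse_denoiser(y):
    """Closed-form E[x | y]: posterior-weighted average of component means."""
    v = s2 + sigma2                                # marginal variances
    logp = -0.5 * (y - mu)**2 / v - 0.5 * np.log(v) + np.log(w)
    r = np.exp(logp - logp.max())
    r /= r.sum()                                   # posterior component weights
    post_means = mu + s2 / v * (y - mu)            # per-component E[x | y, k]
    return float(r @ post_means)

# PnP proximal gradient descent on the data term f(x) = 0.5 * (a*x - b)^2:
# x_{t+1} = D(x_t - eta * grad f(x_t)), with D the MMSE denoiser in place
# of the proximal operator.
a, b, eta = 1.0, 1.8, 0.5
x = 0.0
for _ in range(200):
    x = mmse_denoiser(x - eta * a * (a * x - b))
print(x)  # iterate settles near the prior mode closest to b
```

The iterate converges to a fixed point of the PnP map, balancing the quadratic data fit against the implicit mixture regularizer; the paper's result says such iterations enjoy a sublinear rate because the induced regularizer is 1-weakly convex.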