Diffusion models have shown exceptional performance in solving inverse problems. However, one major limitation is their slow inference time. While faster diffusion samplers have been developed for unconditional sampling, there has been limited research on conditional sampling in the context of inverse problems. In this study, we propose a novel and efficient diffusion sampling strategy that employs the geometric decomposition of diffusion sampling. Specifically, we discover that the samples generated by diffusion models can be decomposed into two orthogonal components: a "denoised" component obtained by projecting the sample onto the clean data manifold, and a "noise" component that induces a transition to the next lower-level noisy manifold with the addition of stochastic noise. Furthermore, we prove that, under some conditions on the clean data manifold, the conjugate gradient update for imposing conditioning on the denoised signal remains on the clean manifold, resulting in much faster and more accurate diffusion sampling. Our method is applicable regardless of the parameterization and setting (e.g., VE, VP). Notably, we achieve state-of-the-art reconstruction quality on challenging real-world medical inverse imaging problems, including multi-coil MRI reconstruction and 3D CT reconstruction. Moreover, our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
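The decomposition described above can be illustrated with a minimal sketch: at each reverse step, the sample is split into a "denoised" estimate and a residual "noise" direction; the conditioning is then imposed on the denoised estimate via a few conjugate gradient (CG) iterations on the normal equations, and the result is renoised to the next (lower) noise level. All function names, the regularization weight `lam`, and the identity-denoiser stand-in below are illustrative assumptions, not the paper's actual implementation, which would use a trained score network.

```python
import numpy as np

def cg(apply_op, b, x0, n_iter=20, tol=1e-10):
    """Plain conjugate gradient for a symmetric positive-definite operator."""
    x = x0.copy()
    r = b - apply_op(x)
    p = r.copy()
    rs = r @ r
    if np.sqrt(rs) < tol:
        return x
    for _ in range(n_iter):
        Ap = apply_op(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def decomposed_step(x_t, denoise, A, y, abar_t, abar_prev, lam=1.0, eta=0.0, rng=None):
    """One decomposed reverse step (hypothetical sketch, VP/DDIM-style):
    denoise -> CG data-consistency update on the denoised signal -> renoise."""
    rng = np.random.default_rng() if rng is None else rng
    # "Denoised" component: projection toward the clean data manifold.
    x0 = denoise(x_t, abar_t)
    # CG update solving argmin_x ||A x - y||^2 + lam ||x - x0||^2,
    # warm-started at x0 so the iterate stays close to the clean manifold.
    normal_op = lambda v: A.T @ (A @ v) + lam * v
    x0 = cg(normal_op, A.T @ y + lam * x0, x0)
    # "Noise" component: deterministic direction plus optional stochastic noise,
    # transitioning to the next lower-level noisy manifold.
    direction = (x_t - np.sqrt(abar_t) * x0) / np.sqrt(1.0 - abar_t)
    sigma = eta * np.sqrt(1.0 - abar_prev)
    z = rng.standard_normal(x0.shape)
    coef = np.sqrt(max(1.0 - abar_prev - sigma**2, 0.0))
    return np.sqrt(abar_prev) * x0 + coef * direction + sigma * z
```

In a real sampler, `denoise` would be the Tweedie estimate computed from the trained score network, and `A` the measurement operator (e.g., a subsampled multi-coil Fourier transform for MRI); a handful of CG iterations per step typically suffices, which is what enables the large speedup over gradient-based conditioning.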