This paper introduces a stochastic plug-and-play (PnP) sampling algorithm that leverages variable splitting to sample efficiently from a posterior distribution. The algorithm, based on split Gibbs sampling (SGS), draws inspiration from the alternating direction method of multipliers (ADMM): it divides the challenging task of posterior sampling into two simpler sampling problems. The first depends on the likelihood function, while the second can be interpreted as a Bayesian denoising problem readily handled by a deep generative model. For illustrative purposes, the proposed method is implemented here using state-of-the-art diffusion-based generative models. Like its deterministic PnP counterparts, the proposed method has the great advantage of not requiring an explicit choice of the prior distribution, which is instead encoded in a pre-trained generative model. However, unlike optimization methods (e.g., PnP-ADMM), which generally provide only point estimates, the proposed approach allows conventional Bayesian estimators to be accompanied by confidence intervals at a reasonable additional computational cost. Experiments on commonly studied image processing problems illustrate the efficiency of the proposed sampling strategy, and its performance is compared to recent state-of-the-art optimization and sampling methods.
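The alternation described above (a likelihood-driven sampling step followed by a denoising sampling step) can be sketched on a toy problem. This is a minimal illustration, not the authors' implementation: it assumes a linear-Gaussian likelihood and replaces the diffusion-based denoising step with a simple closed-form Gaussian denoiser, and all names (`rho`, `sample_x_given_z`, `sample_z_given_x`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem y = A x + noise (stand-in for, e.g., deblurring).
d, m = 8, 5
A = rng.standard_normal((m, d))
x_true = rng.standard_normal(d)
sigma = 0.1
y = A @ x_true + sigma * rng.standard_normal(m)

rho = 0.3              # splitting parameter coupling x and its auxiliary copy z
n_iter, burn = 2000, 500

def sample_x_given_z(z):
    # Likelihood step: x | z, y is Gaussian with
    # precision Q = A^T A / sigma^2 + I / rho^2.
    Q = A.T @ A / sigma**2 + np.eye(d) / rho**2
    mean = np.linalg.solve(Q, A.T @ y / sigma**2 + z / rho**2)
    L = np.linalg.cholesky(np.linalg.inv(Q))
    return mean + L @ rng.standard_normal(d)

def sample_z_given_x(x):
    # Denoising step: with a standard-Gaussian prior on z (an illustrative
    # assumption), z | x is Gaussian in closed form; the paper instead
    # performs this step with a pre-trained diffusion model.
    var = 1.0 / (1.0 + 1.0 / rho**2)
    mean = var * x / rho**2
    return mean + np.sqrt(var) * rng.standard_normal(d)

# Split Gibbs loop: alternate the two simpler sampling problems.
z = np.zeros(d)
samples = []
for t in range(n_iter):
    x = sample_x_given_z(z)
    z = sample_z_given_x(x)
    if t >= burn:
        samples.append(x)

samples = np.asarray(samples)
post_mean = samples.mean(axis=0)                             # point estimate
ci_lo, ci_hi = np.percentile(samples, [2.5, 97.5], axis=0)   # credible band
```

The retained samples approximate the posterior, so conventional Bayesian estimators (here the posterior mean) come with per-coordinate credible intervals essentially for free, which is the uncertainty quantification advantage the abstract contrasts with point-estimate methods such as PnP-ADMM.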