Approximate Bayesian inference estimates descriptors of an intractable target distribution; in essence, it is an optimization problem within a family of distributions. For example, Langevin dynamics (LD) extracts asymptotically exact samples from a diffusion process because the time evolution of its marginal distributions constitutes a curve that minimizes the KL divergence via steepest descent in the Wasserstein space. Parallel to LD, Stein variational gradient descent (SVGD) similarly minimizes the KL divergence, albeit on a space endowed with a novel Stein-Wasserstein distance, by deterministically transporting a set of particle samples, thus de-randomizing the stochastic diffusion process. We propose de-randomized kernel-based particle samplers for all diffusion-based samplers, known as MCMC dynamics. Following previous work in interpreting MCMC dynamics, we equip the Stein-Wasserstein space with a fiber-Riemannian Poisson structure, capable of characterizing a fiber-gradient Hamiltonian flow that simulates MCMC dynamics. Such dynamics discretizes into generalized SVGD (GSVGD), a Stein-type deterministic particle sampler whose particle updates coincide with applying the diffusion Stein operator to a kernel function. We demonstrate empirically that GSVGD can de-randomize complex MCMC dynamics, which combine the advantages of auxiliary momentum variables and Riemannian structure, while maintaining the high sample quality of an interacting particle system.
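As a point of reference, the following is a minimal sketch of the standard SVGD particle update, which applies the Langevin Stein operator to the kernel and averages over the current particles; the notation (step size $\epsilon$, kernel $k$, target density $p$, particle count $n$) is introduced here for illustration and is not taken from the abstract:
\[
  x_i \;\leftarrow\; x_i + \epsilon\,\hat{\phi}(x_i),
  \qquad
  \hat{\phi}(x) \;=\; \frac{1}{n}\sum_{j=1}^{n}
  \big[\, k(x_j, x)\,\nabla_{x_j}\log p(x_j) \;+\; \nabla_{x_j} k(x_j, x) \,\big].
\]
As described above, GSVGD replaces the Langevin Stein operator inside the bracket with the diffusion Stein operator of the MCMC dynamics being de-randomized, while keeping the same kernel-averaged particle transport.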