Hybrid Monte Carlo (HMC) is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions. However, a major limitation of HMC is that it cannot be applied to discrete domains, where no gradient signal is available. In this work, we introduce a new approach that augments Monte Carlo methods with SurVAE Flows to sample from discrete distributions, combining neural transport methods, namely normalizing flows and variational dequantization, with the Metropolis-Hastings rule. Our method first learns a continuous embedding of the discrete space using a surjective map and subsequently learns a bijective transformation from this continuous space to an approximately Gaussian-distributed latent variable. Sampling proceeds by simulating MCMC chains in the latent space and mapping the resulting samples to the target discrete space via the learned transformations. We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics, and machine learning, and observe improvements compared to alternative sampling algorithms.
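To illustrate the mechanics described above, the following is a minimal, self-contained sketch, not the paper's method: the learned SurVAE flow and HMC dynamics are replaced here by plain uniform dequantization and a random-walk Metropolis proposal, and the target, step size, and helper names (`target_logpmf`, `dequantized_logpdf`, `sample_discrete`) are hypothetical choices for illustration. It shows the general pattern of simulating a chain in a continuous relaxation of a discrete space, correcting with the Metropolis-Hastings rule, and mapping samples back through a surjective (floor) map.

```python
# Toy latent-space sampler for a discrete target (assumptions noted in the lead-in).
# The learned flow is omitted; uniform dequantization alone defines the continuous density.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

def target_logpmf(k):
    """Discrete target: Poisson(3), chosen purely for illustration."""
    return poisson.logpmf(k, mu=3.0)

def dequantized_logpdf(x):
    """Uniform dequantization: the density of x in [k, k+1) equals the pmf at k."""
    if x < 0:
        return -np.inf
    return target_logpmf(int(np.floor(x)))

def sample_discrete(n_steps=20000, step_size=1.0, x0=3.5):
    """Random-walk Metropolis in the continuous relaxation; floor recovers discrete draws."""
    x, logp = x0, dequantized_logpdf(x0)
    samples = []
    for _ in range(n_steps):
        x_prop = x + step_size * rng.normal()
        logp_prop = dequantized_logpdf(x_prop)
        if np.log(rng.uniform()) < logp_prop - logp:  # Metropolis-Hastings acceptance
            x, logp = x_prop, logp_prop
        samples.append(int(np.floor(x)))              # surjective map back to the integers
    return np.array(samples)

draws = sample_discrete()
print("empirical mean:", draws.mean(), " (Poisson(3) mean = 3)")
```

In the full method, the random-walk proposal would be replaced by HMC in a latent space that the trained flow has shaped to be approximately Gaussian, and the floor operation by the learned surjective map.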