Gradients have been exploited in proposal distributions to accelerate the convergence of Markov chain Monte Carlo algorithms on discrete distributions. However, these methods require a natural differentiable extension of the target discrete distribution, which often does not exist or fails to provide effective gradient guidance. In this paper, we develop a gradient-like proposal for arbitrary discrete distributions without this strong requirement. Built upon a locally-balanced proposal, our method efficiently approximates the discrete likelihood ratio via a Newton's series expansion, enabling large yet efficient exploration of discrete spaces. We show that our method can also be viewed as a multilinear extension, thereby inheriting its desirable properties. We prove that our method enjoys a guaranteed convergence rate with or without the Metropolis-Hastings step. Furthermore, our method outperforms a number of popular alternatives in several experiments, including the facility location problem, extractive text summarization, and image retrieval.
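For orientation, here is a hedged sketch of the two ingredients named above, written in standard notation that may differ from the paper's own. A locally-balanced proposal (Zanella, 2020) over a neighborhood $N(x)$ uses a balancing function $g$ satisfying $g(t) = t\,g(1/t)$:

\[
  Q(x' \mid x) \;\propto\; g\!\left(\frac{\pi(x')}{\pi(x)}\right)\mathbf{1}\{x' \in N(x)\},
  \qquad \text{e.g. } g(t) = \sqrt{t} \ \text{ or } \ g(t) = \tfrac{t}{1+t}.
\]

Writing $\pi(x) \propto e^{f(x)}$, computing this proposal requires the likelihood ratio, i.e. $f(x') - f(x)$, for every neighbor $x'$. Gradient-based samplers approximate it with the Taylor expansion $f(x') - f(x) \approx \nabla f(x)^{\top}(x' - x)$, which presumes a differentiable extension of $f$. Newton's series plays the analogous role with finite differences in place of derivatives,

\[
  f(x) \;=\; \sum_{k=0}^{\infty} \binom{x-a}{k}\,\Delta^{k} f(a),
  \qquad \Delta f(a) = f(a+1) - f(a),
\]

so a truncated expansion can estimate the discrete likelihood ratio from function evaluations alone, with no differentiable extension required.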