Stein Variational Gradient Descent (SVGD) is a popular sampling algorithm used in various machine learning tasks. It is well known that SVGD arises from a discretization of the kernelized gradient flow of the Kullback-Leibler divergence $D_{KL}\left(\cdot\mid\pi\right)$, where $\pi$ is the target distribution. In this work, we propose to enhance SVGD via the introduction of importance weights, which leads to a new method for which we coin the name $\beta$-SVGD. In the continuous-time and infinite-particle regime, the time for this flow to converge to the equilibrium distribution $\pi$, quantified by the Stein Fisher information, depends only very weakly on the initial distribution $\rho_0$ and on $\pi$. This is very different from the kernelized gradient flow of the Kullback-Leibler divergence, whose convergence time depends on $D_{KL}\left(\rho_0\mid\pi\right)$. Under certain assumptions, we provide a descent lemma for the population-limit $\beta$-SVGD, which recovers the descent lemma for the population-limit SVGD as $\beta\to 0$. We also illustrate the advantages of $\beta$-SVGD over SVGD through experiments.
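For context, the following is a minimal NumPy sketch of the standard SVGD particle update that the kernelized gradient flow of the Kullback-Leibler divergence discretizes (Liu and Wang, 2016), using a fixed-bandwidth RBF kernel. The function names and the bandwidth parameter `h` are illustrative rather than taken from the paper, and the sketch does not include the importance weights that define $\beta$-SVGD.

```python
import numpy as np

def svgd_step(X, grad_log_pi, step_size=1e-2, h=1.0):
    """One standard SVGD update with RBF kernel k(x, y) = exp(-||x - y||^2 / h).

    X           : (n, d) array of particle positions
    grad_log_pi : callable returning the (n, d) array of scores grad log pi(x_j)
    """
    diff = X[:, None, :] - X[None, :, :]             # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)      # kernel matrix K[i, j] = k(x_i, x_j)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) = sum_j (2/h) (x_i - x_j) k(x_i, x_j)
    repulsion = (2.0 / h) * np.einsum("ij,ijd->id", K, diff)
    # Driving term: sum_j k(x_j, x_i) grad log pi(x_j)
    drive = K @ grad_log_pi(X)
    phi = (drive + repulsion) / X.shape[0]           # kernelized Stein update direction
    return X + step_size * phi

# Example: sample from a standard Gaussian target, for which grad log pi(x) = -x.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=1.0, size=(100, 2))
for _ in range(500):
    particles = svgd_step(particles, lambda X: -X, step_size=0.1)
```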