Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form $\pi(x) \propto \exp(-V(x))$. In the existing theory of Langevin-type algorithms and SVGD, the potential function $V$ is often assumed to be $L$-smooth. However, this restrictive condition excludes a large class of potentials, such as polynomials of degree greater than $2$. Our paper studies the convergence of the SVGD algorithm for distributions with $(L_0,L_1)$-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. [2019a] for the analysis of gradient clipping algorithms. Under trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the $\mathrm{KL}$ divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
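For context, the two objects the abstract refers to take the following standard forms (a sketch of the usual definitions; the paper's exact statements may differ). A twice-differentiable potential $V$ is $(L_0,L_1)$-smooth, in the sense of Zhang et al. [2019a], if
$$\|\nabla^2 V(x)\| \le L_0 + L_1 \|\nabla V(x)\| \quad \text{for all } x \in \mathbb{R}^d,$$
and, given a kernel $k$ and step size $\epsilon$, the SVGD iteration moves particles $x_1,\dots,x_n$ by
$$x_i \leftarrow x_i + \frac{\epsilon}{n} \sum_{j=1}^{n} \Bigl[ -k(x_j, x_i)\,\nabla V(x_j) + \nabla_{x_j} k(x_j, x_i) \Bigr],$$
where $\nabla \log \pi = -\nabla V$ since $\pi(x) \propto \exp(-V(x))$.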