Let $V_* : \mathbb{R}^d \to \mathbb{R}$ be some (possibly non-convex) potential function, and consider the probability measure $\pi \propto e^{-V_*}$. When $\pi$ exhibits multiple modes, it is known that sampling techniques based on Wasserstein gradient flows of the Kullback-Leibler (KL) divergence (e.g. Langevin Monte Carlo) suffer from slow convergence, as the dynamics are unable to easily traverse between modes. In stark contrast, the work of Lu et al. (2019; 2022) has shown that the gradient flow of the KL with respect to the Fisher-Rao (FR) geometry exhibits a convergence rate to $\pi$ that is \textit{independent} of the potential function. In this short note, we complement these existing results in the literature by providing an explicit expansion of $\text{KL}(\rho_t^{\text{FR}}\|\pi)$ in terms of $e^{-t}$, where $(\rho_t^{\text{FR}})_{t\geq 0}$ is the FR gradient flow of the KL divergence. In turn, we are able to provide a clean asymptotic convergence rate, where the burn-in time is guaranteed to be finite. Our proof is based on observing a similarity between FR gradient flows and simulated annealing with linear scaling, together with facts about cumulant generating functions. We conclude with simple synthetic experiments that demonstrate our theoretical findings are indeed tight. Based on our numerics, we conjecture that the asymptotic rates of convergence for Wasserstein-Fisher-Rao gradient flows may be related to this expansion in some cases.
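The abstract's claims can be illustrated numerically. On a finite state space, the FR gradient flow of the KL admits the closed-form geometric interpolation $\rho_t \propto \rho_0^{e^{-t}}\,\pi^{1-e^{-t}}$ (the simulated-annealing-with-linear-scaling connection mentioned above), and a second-order expansion in $\varepsilon = e^{-t}$ gives $\text{KL}(\rho_t\|\pi) \approx \tfrac{\varepsilon^2}{2}\,\mathrm{Var}_\pi\!\left(\log\tfrac{\rho_0}{\pi}\right)$. The following is a minimal sketch, with an arbitrary synthetic target `pi` and initialization `rho0` (both assumptions for illustration, not from the paper's experiments):

```python
import numpy as np

# Closed form of the Fisher-Rao gradient flow of KL(rho || pi) on a finite
# state space: rho_t is a (normalized) geometric interpolation between the
# initialization rho_0 and the target pi, with weight e^{-t} on rho_0.
rng = np.random.default_rng(0)
pi = rng.random(50); pi /= pi.sum()        # arbitrary positive target
rho0 = rng.random(50); rho0 /= rho0.sum()  # arbitrary positive initialization

def fr_flow(t):
    """rho_t proportional to rho_0^{e^{-t}} * pi^{1 - e^{-t}}."""
    eps = np.exp(-t)
    w = rho0**eps * pi**(1.0 - eps)
    return w / w.sum()

def kl(p, q):
    """KL divergence between discrete distributions with full support."""
    return float(np.sum(p * np.log(p / q)))

ts = np.linspace(0.0, 10.0, 21)
kls = [kl(fr_flow(t), pi) for t in ts]

# Leading term of the expansion in e^{-t}:
# KL(rho_t || pi) ~ (e^{-2t} / 2) * Var_pi(log(rho_0 / pi)).
f = np.log(rho0 / pi)
var_pi_f = float(np.sum(pi * (f - np.sum(pi * f)) ** 2))
```

Along the sampled times, `kls` decreases monotonically, and at large $t$ the ratio `kls[-1] / (0.5 * np.exp(-2 * ts[-1]) * var_pi_f)` is close to one, consistent with the stated $e^{-t}$ expansion.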