In this paper, we consider the underdamped Langevin diffusion (ULD) and propose a numerical approximation using its associated ordinary differential equation (ODE). When used as a Markov Chain Monte Carlo (MCMC) algorithm, we show that the ODE approximation achieves a $2$-Wasserstein error of $\varepsilon$ in $\mathcal{O}\big(d^{\frac{1}{3}}/\varepsilon^{\frac{2}{3}}\big)$ steps under the standard smoothness and strong convexity assumptions on the target distribution. This matches the complexity of the randomized midpoint method proposed by Shen and Lee [NeurIPS 2019] which was shown to be order optimal by Cao, Lu and Wang. However, the main feature of the proposed numerical method is that it can utilize additional smoothness of the target log-density $f$. More concretely, we show that the ODE approximation achieves a $2$-Wasserstein error of $\varepsilon$ in $\mathcal{O}\big(d^{\frac{2}{5}}/\varepsilon^{\frac{2}{5}}\big)$ and $\mathcal{O}\big(\sqrt{d}/\varepsilon^{\frac{1}{3}}\big)$ steps when Lipschitz continuity is assumed for the Hessian and third derivative of $f$. By discretizing this ODE using a fourth order splitting method, we obtain a practical MCMC method that requires just three additional gradient evaluations in each step. In our experiment, where the target comes from a logistic regression, this method shows faster convergence compared to other unadjusted Langevin MCMC algorithms.
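To make the setting concrete, the following is a minimal sketch of an underdamped Langevin MCMC step using a standard BAOAB-type splitting of the diffusion $dx = v\,dt$, $dv = -\nabla f(x)\,dt - \gamma v\,dt + \sqrt{2\gamma}\,dW$. This is a generic illustration, not the fourth-order splitting scheme proposed in the paper; the standard Gaussian target $f(x) = \|x\|^2/2$, the step size `h`, and the friction `gamma` are all assumed for demonstration.

```python
import numpy as np

def grad_f(x):
    # Gradient of the assumed target log-density f(x) = ||x||^2 / 2
    return x

def baoab_step(x, v, h, gamma, rng):
    """One BAOAB splitting step for the underdamped Langevin diffusion."""
    v = v - 0.5 * h * grad_f(x)            # B: half kick from the gradient
    x = x + 0.5 * h * v                    # A: half drift of the position
    c = np.exp(-gamma * h)                 # O: exact Ornstein-Uhlenbeck flow
    v = c * v + np.sqrt(1.0 - c**2) * rng.standard_normal(x.shape)
    x = x + 0.5 * h * v                    # A: half drift of the position
    v = v - 0.5 * h * grad_f(x)            # B: half kick from the gradient
    return x, v

rng = np.random.default_rng(0)
d, h, gamma, n_steps = 2, 0.1, 2.0, 20000
x, v = np.zeros(d), np.zeros(d)
samples = np.empty((n_steps, d))
for i in range(n_steps):
    x, v = baoab_step(x, v, h, gamma, rng)
    samples[i] = x
```

For the standard Gaussian target, the empirical mean and variance of `samples` should be close to 0 and 1 respectively; the paper's higher-order scheme improves on this kind of first/second-order splitting by exploiting extra smoothness of $f$.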