Discretization of continuous-time diffusion processes is a widely recognized method for sampling. However, the canonical Euler-Maruyama discretization of the Langevin diffusion process, referred to as Langevin Monte Carlo (LMC), has been studied mostly in the context of smooth (gradient-Lipschitz) and strongly log-concave densities, which is a considerable hindrance to its deployment in many sciences, including computational statistics and statistical learning. In this paper, we make several theoretical contributions to the literature on such sampling methods for weakly smooth and non-convex densities. In particular, we use convexification of the non-convex domain \citep{ma2019sampling} in combination with regularization to prove convergence in Kullback-Leibler (KL) divergence, with the number of iterations required to reach an $\epsilon$-neighborhood of the target distribution depending only polynomially on the dimension. We relax the conditions of \citep{vempala2019rapid} and prove convergence guarantees under isoperimetry, degenerate convexity, and non-strong convexity at infinity.
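For reference, a minimal sketch of the LMC iteration is given below, assuming a step size $\eta > 0$ and writing the target density as $\pi \propto e^{-U}$, so that the Euler-Maruyama discretization of the Langevin diffusion takes its standard form:
\[
x_{k+1} = x_k - \eta \nabla U(x_k) + \sqrt{2\eta}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_d).
\]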