We provide a nonasymptotic analysis of the convergence of the stochastic gradient Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without assuming log-concavity. Our analysis quantifies key theoretical properties of the SGHMC as a sampler under local conditions, which significantly improves upon previous results. In particular, we prove that the Wasserstein-2 distance between the target and the law of the SGHMC is uniformly controlled by the step-size of the algorithm, thereby demonstrating that the SGHMC can provide high-precision results uniformly in the number of iterations. The analysis also allows us to obtain nonasymptotic bounds for nonconvex optimization problems under local conditions and implies that the SGHMC, when viewed as a nonconvex optimizer, converges to a global minimum at the best known rates. We apply our results to obtain nonasymptotic bounds for scalable Bayesian inference and nonasymptotic generalization bounds.
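For concreteness, one standard form of the SGHMC recursion is sketched below; the step-size $\eta$, friction $\gamma$, inverse temperature $\beta$, stochastic gradient estimate $\widehat{\nabla f}$, and data sample $X_{k+1}$ are illustrative notation for this sketch and need not match the paper's own symbols. The step-size $\eta$ appearing here is the quantity that, per the result above, uniformly controls the Wasserstein-2 distance to the target.
\[
\begin{aligned}
V_{k+1} &= V_k - \eta\,\gamma\, V_k - \eta\, \widehat{\nabla f}(\theta_k, X_{k+1}) + \sqrt{2\gamma\eta\beta^{-1}}\;\xi_{k+1},\\
\theta_{k+1} &= \theta_k + \eta\, V_{k+1},
\end{aligned}
\]
where $(\xi_k)_{k\ge 1}$ are i.i.d. standard Gaussian vectors, $(\theta_k)_{k\ge 0}$ are the position iterates whose law is compared to the target, and $(V_k)_{k\ge 0}$ are the momentum variables.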