Sampling is a fundamental task with numerous applications in machine learning. One approach to sampling from a high-dimensional distribution $e^{-f}$, for some function $f$, is the Langevin Algorithm (LA). Recently there has been significant progress in establishing fast convergence of LA even when $f$ is non-convex, notably in [53] and [39]: the former focuses on functions $f$ defined on $\mathbb{R}^n$, while the latter focuses on functions with symmetries (such as matrix-completion-type objectives) that induce a manifold structure. Our work generalizes the results of [53] to the setting where $f$ is defined on a manifold $M$ rather than on $\mathbb{R}^n$. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution $e^{-f}$ satisfies a log-Sobolev inequality on $M$.
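For readers unfamiliar with LA, a minimal sketch of the Euclidean case (the setting of [53]) follows; the manifold setting replaces the gradient step and Gaussian noise with their Riemannian counterparts. This is an illustration under standard assumptions, not the algorithm of this paper: the function names, the step size $\eta$, and the Gaussian target used for testing are all ours.

```python
import numpy as np

def langevin_step(x, grad_f, eta, rng):
    # One step of the (unadjusted) Langevin Algorithm in R^n:
    #   x_{k+1} = x_k - eta * grad f(x_k) + sqrt(2 * eta) * xi_k,  xi_k ~ N(0, I)
    return x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)

# Illustrative target: e^{-f} with f(x) = ||x||^2 / 2, i.e. a standard Gaussian,
# so grad f(x) = x. (Chosen only so the sketch is easy to check.)
grad_f = lambda x: x

rng = np.random.default_rng(0)
x = np.zeros(5)
samples = []
for k in range(20000):
    x = langevin_step(x, grad_f, eta=0.05, rng=rng)
    if k >= 5000:  # discard burn-in iterates
        samples.append(x.copy())
samples = np.asarray(samples)
```

For this Gaussian target the empirical mean and variance of the post-burn-in iterates are close to $0$ and $1$ respectively, up to an $O(\eta)$ discretization bias.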