Sampling from Gibbs distributions $p(x) \propto \exp(-V(x)/\varepsilon)$ and computing their log-partition function are fundamental tasks in statistics, machine learning, and statistical physics. However, while efficient algorithms are known for convex potentials $V$, the situation is much more difficult in the non-convex case, where algorithms necessarily suffer from the curse of dimensionality in the worst case. For optimization, which can be seen as a low-temperature limit of sampling, it is known that smooth functions $V$ allow faster convergence rates. Specifically, for $m$-times differentiable functions in $d$ dimensions, the optimal rate for algorithms with $n$ function evaluations is known to be $O(n^{-m/d})$, where the constant can potentially depend on $m$, $d$, and the function to be optimized. Hence, the curse of dimensionality can be alleviated for smooth functions, at least in terms of the convergence rate. Recently, it has been shown that similarly fast rates can also be achieved with polynomial runtime $O(n^{3.5})$, where the exponent $3.5$ is independent of $m$ and $d$. It is therefore natural to ask whether similar rates for sampling and log-partition computation are possible, and whether they can be realized in polynomial time with an exponent independent of $m$ and $d$. We show that the optimal rates for sampling and log-partition computation are sometimes equal to and sometimes faster than the rates for optimization. We then analyze various polynomial-time sampling algorithms, including an extension of a recent promising optimization approach, and find that they sometimes exhibit interesting behavior but do not achieve near-optimal rates. Our results also give further insight into the relations between sampling, log-partition, and optimization problems.
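For reference, a minimal sketch of the quantities involved, assuming a bounded domain $\Omega \subseteq \mathbb{R}^d$ and a continuous potential $V$ (the paper's precise setting may differ): the Gibbs density, its normalizing constant (the partition function), and the log-partition quantity whose low-temperature limit recovers the optimization problem are
\[
p_\varepsilon(x) = \frac{\exp(-V(x)/\varepsilon)}{Z_\varepsilon}, \qquad
Z_\varepsilon = \int_{\Omega} \exp(-V(x)/\varepsilon)\,\mathrm{d}x, \qquad
-\varepsilon \log Z_\varepsilon \;\xrightarrow{\;\varepsilon \to 0^+\;}\; \min_{x \in \Omega} V(x),
\]
where the last limit follows from Laplace's principle and makes precise the sense in which optimization is a low-temperature limit of log-partition computation (and hence of sampling).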