We propose a Langevin diffusion-based algorithm for non-convex optimization and sampling on a product manifold of spheres. Under a logarithmic Sobolev inequality, we establish a finite-iteration convergence guarantee to the Gibbs distribution in terms of Kullback--Leibler divergence. We show that with an appropriate temperature choice, the suboptimality gap to the global minimum is guaranteed to be arbitrarily small with high probability. As an application, we consider the Burer--Monteiro approach for solving a semidefinite program (SDP) with diagonal constraints, and analyze the proposed Langevin algorithm for optimizing the non-convex objective. In particular, we establish a logarithmic Sobolev inequality for the Burer--Monteiro problem when there are no spurious local minima, but in the presence of saddle points. Combining these results, we provide a global optimality guarantee for the SDP and the Max-Cut problem. More precisely, we show that the Langevin algorithm achieves $\epsilon$ accuracy with high probability in $\widetilde{\Omega}( \epsilon^{-5} )$ iterations.
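The Riemannian Langevin iteration on a product of spheres described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the Burer--Monteiro objective $f(Y) = -\langle A, YY^\top\rangle$ with unit-norm rows (as in Max-Cut relaxations), the step size `eta`, and the inverse temperature `beta` are placeholder choices, and rows are retracted back to the sphere by simple normalization.

```python
import numpy as np

def project_rows_tangent(Y, G):
    # Project each row of G onto the tangent space of the unit sphere
    # at the corresponding row of Y: g - <g, y> y.
    return G - np.sum(G * Y, axis=1, keepdims=True) * Y

def langevin_burer_monteiro(A, k, beta=50.0, eta=1e-3, n_iters=5000, seed=0):
    # Hypothetical sketch: Langevin dynamics on the product of (k-1)-spheres,
    # minimizing f(Y) = -<A, Y Y^T>, the Burer--Monteiro factorization of an
    # SDP with diagonal constraints (e.g. the Max-Cut relaxation).
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Y = rng.standard_normal((n, k))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)   # rows start on unit spheres
    for _ in range(n_iters):
        grad = -2.0 * A @ Y                          # Euclidean gradient of f
        rgrad = project_rows_tangent(Y, grad)        # Riemannian gradient
        noise = project_rows_tangent(Y, rng.standard_normal((n, k)))
        # Langevin step: gradient descent plus tangent Gaussian noise at
        # temperature 1/beta, followed by retraction (row normalization).
        Y = Y - eta * rgrad + np.sqrt(2.0 * eta / beta) * noise
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    return Y
```

With a low temperature (large `beta`), the Gibbs distribution concentrates near the global minimizers of $f$, which is what underlies the high-probability suboptimality guarantee stated above.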