We consider solving high-order semidefinite programming (SDP) relaxations of nonconvex polynomial optimization problems (POPs) that often admit degenerate rank-one optimal solutions. Instead of solving the SDP alone, we propose a new algorithmic framework that blends local search using the nonconvex POP into global descent using the convex SDP. In particular, we first design a globally convergent inexact projected gradient method (iPGM) for solving the SDP that serves as the backbone of our framework. We then accelerate iPGM by taking long, but safeguarded, rank-one steps generated by fast nonlinear programming algorithms. We prove that the new framework is still globally convergent for solving the SDP. To solve the iPGM subproblem of projecting a given point onto the feasible set of the SDP, we design a two-phase algorithm with phase one using a symmetric Gauss-Seidel based accelerated proximal gradient method (sGS-APG) to generate a good initial point, and phase two using a modified limited-memory BFGS (L-BFGS) method to obtain an accurate solution. We analyze the convergence for both phases and establish a novel global convergence result for the modified L-BFGS that does not require the objective function to be twice continuously differentiable. We conduct numerical experiments for solving second-order SDP relaxations arising from a diverse set of POPs. Our framework demonstrates state-of-the-art efficiency, scalability, and robustness in solving degenerate rank-one SDPs to high accuracy, even in the presence of millions of equality constraints.
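The projected gradient backbone underlying iPGM can be illustrated on a toy problem over the PSD cone. The sketch below is illustrative only: it uses an exact eigendecomposition-based projection and a simple least-squares objective, whereas the paper's iPGM projects inexactly onto the full SDP feasible set (with equality constraints). All function names here are our own, not from the paper.

```python
import numpy as np

def proj_psd(X):
    """Exact projection of a symmetric matrix onto the PSD cone
    (keep the nonnegative part of the spectrum)."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def projected_gradient(grad, proj, X0, step=0.5, iters=300):
    """Basic projected gradient iteration: X <- proj(X - step * grad(X))."""
    X = X0
    for _ in range(iters):
        X = proj(X - step * grad(X))
    return X

# Toy instance: min 0.5 * ||X - M||_F^2 over the PSD cone,
# whose solution is exactly proj_psd(M).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
M = (M + M.T) / 2
X_star = projected_gradient(lambda X: X - M, proj_psd, np.zeros((5, 5)))
```

In the paper's setting the projection subproblem itself is expensive (hence the two-phase sGS-APG / L-BFGS solver), and the iteration is accelerated by safeguarded rank-one steps from nonlinear programming; this sketch captures only the outer projected-gradient structure.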