Many problems in machine learning can be reduced to learning a low-rank positive semidefinite matrix (denoted as $Z$), which leads to a semidefinite program (SDP). Existing SDP solvers based on classical convex optimization are too expensive for large-scale problems. Exploiting the low rank of the solution, the Burer-Monteiro method reformulates the SDP as a nonconvex problem via the \emph{quadratic} factorization $Z = XX^\top$. However, this loses the structure of the problem during optimization. In this paper, we propose to convert the SDP into a biconvex problem via the \emph{bilinear} factorization $Z = XY^\top$, while adding the term $\frac{\gamma}{2}\|X-Y\|_F^2$ to penalize the difference between $X$ and $Y$. Thus, the biconvex structure (w.r.t. $X$ and $Y$) can be exploited naturally in optimization. As a theoretical result, we provide a bound on the penalty parameter $\gamma$ under the assumptions of $L$-Lipschitz smoothness and $\sigma$-strong biconvexity, such that, at stationary points, the proposed bilinear factorization is equivalent to Burer-Monteiro's factorization whenever the bound is satisfied, that is, $\gamma > \frac{1}{4}(L-\sigma)_+$. Our proposal opens up a new way to surrogate an SDP by a biconvex program. Experiments on two SDP-related applications demonstrate that the proposed method is as effective as the state-of-the-art.
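As a concrete illustration (not the paper's implementation), the following minimal NumPy sketch applies alternating gradient steps to the penalized biconvex surrogate $\min_{X,Y} f(XY^\top) + \frac{\gamma}{2}\|X-Y\|_F^2$. The smooth stand-in objective $f(Z) = \frac{1}{2}\|Z-M\|_F^2$, the target matrix $M$, and all step sizes and function names are assumptions for demonstration only; the point is that each block subproblem (in $X$ with $Y$ fixed, and vice versa) is convex.

```python
import numpy as np

# Minimal sketch (assumed objective, not the authors' code): alternating
# gradient steps on the biconvex surrogate
#     min_{X, Y}  f(X Y^T) + (gamma / 2) * ||X - Y||_F^2
# with the illustrative smooth objective f(Z) = 0.5 * ||Z - M||_F^2.

def grad_f(Z, M):
    # Gradient of the illustrative objective f(Z) = 0.5 * ||Z - M||_F^2.
    return Z - M

def bilinear_sdp_sketch(M, rank, gamma=5.0, lr=0.01, iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    X = 0.1 * rng.standard_normal((n, rank))
    Y = X.copy()
    for _ in range(iters):
        G = grad_f(X @ Y.T, M)
        # Gradient step in X (the subproblem is convex in X for fixed Y).
        X -= lr * (G @ Y + gamma * (X - Y))
        # Gradient step in Y (the subproblem is convex in Y for fixed X).
        G = grad_f(X @ Y.T, M)
        Y -= lr * (G.T @ X + gamma * (Y - X))
    return X, Y

if __name__ == "__main__":
    # Illustrative low-rank PSD target M = A A^T.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 3))
    M = A @ A.T
    X, Y = bilinear_sdp_sketch(M, rank=3)
    print("||X - Y||_F      =", np.linalg.norm(X - Y))
    print("relative error    =", np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M))
```

In this sketch the penalty drives $X$ and $Y$ toward each other, so at convergence $XY^\top$ behaves like the symmetric factorization $XX^\top$ while each block update remains a convex subproblem.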