In this paper, we study a class of useful non-convex minimax optimization problems on Riemannian manifolds and propose a class of Riemannian gradient descent ascent algorithms to solve these minimax problems. Specifically, we propose a new Riemannian gradient descent ascent (RGDA) algorithm for \textbf{deterministic} minimax optimization. Moreover, we prove that the RGDA has a sample complexity of $O(\kappa^2\epsilon^{-2})$ for finding an $\epsilon$-stationary point of the non-convex strongly-concave minimax problems, where $\kappa$ denotes the condition number. At the same time, we introduce a Riemannian stochastic gradient descent ascent (RSGDA) algorithm for \textbf{stochastic} minimax optimization. In the theoretical analysis, we prove that the RSGDA can achieve a sample complexity of $O(\kappa^3\epsilon^{-4})$. To further reduce the sample complexity, we propose a novel momentum variance-reduced Riemannian stochastic gradient descent ascent (MVR-RSGDA) algorithm based on the momentum-based variance-reduction technique of STORM. We prove that the MVR-RSGDA algorithm achieves a lower sample complexity of $\tilde{O}(\kappa^{3-\nu/2}\epsilon^{-3})$ for $\nu \geq 0$, which matches the best known sample complexity for its Euclidean counterpart. Extensive experimental results on robust training of deep neural networks over the Stiefel manifold demonstrate the efficiency of our proposed algorithms.
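To make the deterministic RGDA update concrete, the following is a minimal sketch of one iteration over the Stiefel manifold $\mathrm{St}(p,n)=\{X\in\mathbb{R}^{n\times p}: X^\top X = I_p\}$, using the standard tangent-space projection and a QR-based retraction. The gradient oracles \texttt{grad\_x} and \texttt{grad\_y} and the step sizes \texttt{eta\_x}, \texttt{eta\_y} are hypothetical placeholders for illustration; in the paper, the step sizes are chosen as functions of the condition number $\kappa$.

\begin{verbatim}
import numpy as np

def stiefel_proj(X, G):
    # Project the Euclidean gradient G onto the tangent space of the
    # Stiefel manifold {X : X^T X = I} at X.
    sym = (X.T @ G + G.T @ X) / 2
    return G - X @ sym

def qr_retract(X, V):
    # Retract the tangent vector V at X back onto the manifold via a
    # QR decomposition (a standard first-order retraction).
    Q, R = np.linalg.qr(X + V)
    # Fix column signs so the retraction is uniquely defined.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def rgda_step(X, y, grad_x, grad_y, eta_x, eta_y):
    # One RGDA iteration: Riemannian gradient descent on the manifold
    # variable X, Euclidean gradient ascent on the variable y.
    rgrad = stiefel_proj(X, grad_x(X, y))   # Riemannian gradient
    X_new = qr_retract(X, -eta_x * rgrad)   # descent step via retraction
    y_new = y + eta_y * grad_y(X, y)        # ascent step
    return X_new, y_new
\end{verbatim}

The stochastic variants RSGDA and MVR-RSGDA follow the same template, replacing the exact gradients with mini-batch estimates (and, for MVR-RSGDA, with STORM-style momentum-corrected estimates).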