In this paper, we study a class of useful minimax optimization problems on Riemannian manifolds and propose a class of Riemannian gradient-based methods to solve them. Specifically, we propose a Riemannian gradient descent ascent (RGDA) algorithm for deterministic minimax optimization. Moreover, we prove that our RGDA has a sample complexity of $O(\kappa^2\epsilon^{-2})$ for finding an $\epsilon$-stationary point of nonconvex strongly-concave minimax problems, where $\kappa$ denotes the condition number. We also introduce a Riemannian stochastic gradient descent ascent (RSGDA) algorithm for stochastic minimax optimization. In the theoretical analysis, we prove that our RSGDA achieves a sample complexity of $O(\kappa^4\epsilon^{-4})$. To further reduce the sample complexity, we propose an accelerated Riemannian stochastic gradient descent ascent (Acc-RSGDA) algorithm based on the variance-reduction technique. We prove that our Acc-RSGDA algorithm achieves a lower sample complexity of $\tilde{O}(\kappa^{4}\epsilon^{-3})$. Extensive experimental results on robust distributional optimization and Deep Neural Network (DNN) training over the Stiefel manifold demonstrate the efficiency of our algorithms.
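To illustrate the alternating structure of Riemannian gradient descent ascent, the following is a minimal sketch on the unit sphere (the simplest Stiefel manifold $\mathrm{St}(n,1)$). The toy objective $f(x,y)=x^\top A y-\frac{\lambda}{2}\|y\|^2$, the step sizes, and the normalization retraction are illustrative assumptions, not the paper's exact setting or algorithm parameters.

```python
import numpy as np

def rgda_sphere(A, lam=1.0, eta_x=0.05, eta_y=0.1, iters=500, seed=0):
    """Hedged RGDA sketch for min_{x in S^{n-1}} max_{y in R^m}
    f(x, y) = x^T A y - (lam/2) * ||y||^2 (strongly concave in y)."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)            # start on the sphere
    y = np.zeros(m)
    for _ in range(iters):
        # Euclidean partial gradients of f
        gx = A @ y                    # df/dx
        gy = A.T @ x - lam * y        # df/dy
        # Riemannian gradient: project gx onto the tangent space at x
        rgx = gx - (x @ gx) * x
        # Descent step in x, followed by a retraction (renormalization)
        x = x - eta_x * rgx
        x /= np.linalg.norm(x)
        # Plain Euclidean ascent step in the unconstrained variable y
        y = y + eta_y * gy
    return x, y

A = np.array([[2.0, 0.0], [0.0, 1.0]])
x, y = rgda_sphere(A)
```

At a stationary point the inner maximizer satisfies $y^\star = A^\top x/\lambda$ and the projected gradient in $x$ vanishes, while the retraction keeps every iterate exactly on the manifold.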