This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (\texttt{DREAM}), which achieves the best-known theoretical guarantee for finding an $\epsilon$-stationary point of the primal function. For the online setting, the proposed method requires $\mathcal{O}(\kappa^3\epsilon^{-3})$ stochastic first-order oracle (SFO) calls and $\mathcal{O}\big(\kappa^2\epsilon^{-2}/\sqrt{1-\lambda_2(W)}\,\big)$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix~$W$. For the offline setting with a total of $N$ component functions, the proposed method requires $\mathcal{O}\big(\kappa^2 \sqrt{N} \epsilon^{-2}\big)$ SFO calls and the same communication complexity as in the online setting.
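Although the abstract does not spell out \texttt{DREAM}'s update rules, the ingredients it names (a recursive stochastic gradient estimator, gossip communication through the matrix $W$, and simultaneous descent-ascent updates) suggest the generic structure below. This is only an illustrative sketch under those assumptions, not the authors' algorithm; the helpers \texttt{sample\_fn} and \texttt{grad\_fn}, the step sizes \texttt{eta\_x} and \texttt{eta\_y}, and the momentum parameter \texttt{alpha} are hypothetical placeholders.

import numpy as np

def decentralized_gda_step(x, y, gx, gy, x_prev, y_prev,
                           sample_fn, grad_fn, W,
                           eta_x=1e-2, eta_y=1e-2, alpha=0.1):
    """One illustrative round of decentralized stochastic gradient descent
    ascent with a recursive (STORM-style) gradient estimator and gossip
    averaging. A generic sketch, not the DREAM update itself.

    x, y           : (n_nodes, dim) primal/dual iterates, one row per node
    gx, gy         : (n_nodes, dim) recursive gradient trackers per node
    x_prev, y_prev : iterates from the previous round
    sample_fn()    : draws a fresh minibatch (or sample index)
    grad_fn(x_i, y_i, xi) -> (grad_x, grad_y) stochastic gradients at node i
    W              : (n_nodes, n_nodes) doubly stochastic gossip matrix
    """
    n = x.shape[0]
    new_gx, new_gy = np.empty_like(gx), np.empty_like(gy)
    for i in range(n):
        xi = sample_fn()  # one minibatch, reused at both evaluation points
        gxi, gyi = grad_fn(x[i], y[i], xi)                    # current point
        gxi_old, gyi_old = grad_fn(x_prev[i], y_prev[i], xi)  # previous point
        # Recursive estimator: momentum-corrected gradient difference.
        new_gx[i] = gxi + (1.0 - alpha) * (gx[i] - gxi_old)
        new_gy[i] = gyi + (1.0 - alpha) * (gy[i] - gyi_old)
    # Communication round: gossip-average trackers and iterates over W.
    new_gx, new_gy = W @ new_gx, W @ new_gy
    x_new = W @ x - eta_x * new_gx  # descent on the primal variable
    y_new = W @ y + eta_y * new_gy  # ascent on the dual variable
    return x_new, y_new, new_gx, new_gy

The $1/\sqrt{1-\lambda_2(W)}$ factor in the communication bound reflects the mixing rate of such gossip averaging: the closer $\lambda_2(W)$ is to one, the more slowly averages propagate across the network.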