This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which requires $\mathcal{O}(\kappa^3\epsilon^{-3})$ stochastic first-order oracle (SFO) calls and $\mathcal{O}\big(\kappa^2\epsilon^{-2}/\sqrt{1-\lambda_2(W)}\,\big)$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix $W$. To the best of our knowledge, DREAM is the first algorithm whose SFO and communication complexities simultaneously achieve the optimal dependency on $\epsilon$ and $\lambda_2(W)$ for this problem.