In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations, also known as root-finding problems. Our first algorithm is non-accelerated with constant stepsizes and achieves an $\mathcal{O}(1/k)$ best-iterate convergence rate on $\mathbb{E}[ \Vert Gx^k\Vert^2]$ when the underlying operator $G$ is Lipschitz continuous and the equation $Gx = 0$ admits a weak Minty solution, where $\mathbb{E}[\cdot]$ denotes the expectation and $k$ is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. Under the co-coerciveness of $G$, we establish both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ last-iterate convergence rates for this algorithm on both $\mathbb{E}[ \Vert Gx^k\Vert^2]$ and $\mathbb{E}[ \Vert x^{k+1} - x^{k}\Vert^2]$. We then apply our methods to a class of finite-sum nonlinear inclusions that covers various applications in machine learning and statistical learning, especially in federated learning and network optimization. As a result, we obtain two new federated learning-type algorithms for this problem class with rigorous convergence rate guarantees.
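For intuition only (a generic sketch, not the exact scheme analyzed in this paper), a full-vector optimistic gradient step with a constant stepsize $\eta > 0$ can be written as
\[
x^{k+1} = x^k - \eta\,\bigl(2\,Gx^k - Gx^{k-1}\bigr),
\]
and a randomized block-coordinate variant applies this update only to the coordinates of one block $i_k$ sampled at random at iteration $k$, leaving all other blocks unchanged.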