We study finite-sum distributed optimization problems with $n$ clients under the popular $\delta$-similarity condition and $\mu$-strong convexity. Motivated by previous works, we propose two new algorithms, SVRS and AccSVRS. The non-accelerated SVRS method combines gradient sliding with variance reduction and achieves a communication complexity of $\tilde{\mathcal{O}}(n + \sqrt{n}\delta/\mu)$, which is superior to that of existing non-accelerated algorithms. Applying the framework proposed in Katyusha X, we also build a directly accelerated, practical version named AccSVRS with an entirely smoothness-free communication complexity of $\tilde{\mathcal{O}}(n + n^{3/4}\sqrt{\delta/\mu})$, which improves on existing algorithms in ill-conditioned cases. Furthermore, we show a nearly matching lower bound that verifies the tightness of our AccSVRS method.
Title: Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis
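To make the two ingredients of SVRS concrete, below is a minimal NumPy sketch of a variance-reduced sliding round on a toy least-squares finite sum. Everything here is an assumption for illustration: the quadratic clients, the helper names (`grad_i`, `full_grad`, `svrs_round`), the inexact inner solver, and the step-size and snapshot choices (`theta`, `lr`, `inner_steps`, refresh every 10 rounds) are hypothetical and not the paper's exact specification.

```python
import numpy as np

# Toy finite sum f(x) = (1/n) * sum_i f_i(x) with quadratic clients
# f_i(x) = 0.5 * ||A_i x - b_i||^2 (illustrative data, not from the paper).
rng = np.random.default_rng(0)
n, d = 8, 5                                    # number of clients, dimension
A = [rng.standard_normal((10, d)) for _ in range(n)]
b = [rng.standard_normal(10) for _ in range(n)]

def grad_i(i, x):
    """Local gradient on client i (no communication needed)."""
    return A[i].T @ (A[i] @ x - b[i])

def full_grad(x):
    """Average gradient over all clients: one communication round."""
    return sum(grad_i(i, x) for i in range(n)) / n

def svrs_round(x, w, theta=5.0, inner_steps=25, lr=0.01):
    """One variance-reduced sliding round (all choices illustrative).

    Sample a single client i, then locally minimize the shifted model
        f_i(y) + <full_grad(w) - grad_i(i, w), y> + (theta/2)*||y - x||^2
    with a few gradient steps. The shift is the variance-reduction part:
    at the snapshot w, the model's gradient equals the true full gradient.
    Running the inner solver on client i alone is the sliding part: it
    adds local computation but no extra communication rounds.
    """
    i = int(rng.integers(n))                   # sampled client
    shift = full_grad(w) - grad_i(i, w)        # snapshot correction
    y = x.copy()
    for _ in range(inner_steps):               # local-only inexact solver
        y -= lr * (grad_i(i, y) + shift + theta * (y - x))
    return y

x = w = np.zeros(d)
for t in range(300):
    x = svrs_round(x, w)
    if (t + 1) % 10 == 0:                      # assumed snapshot schedule
        w = x
print("||grad f(x)|| =", np.linalg.norm(full_grad(x)))
```

For the accelerated variant, the abstract invokes the Katyusha X framework. Continuing the toy setup above, a generic extrapolation wrapper conveys the rough idea of directly accelerating the round; the coefficient `beta` below is a placeholder momentum weight, not the AccSVRS coupling:

```python
# Generic outer extrapolation (continuing the setup above); beta is a
# placeholder, not the AccSVRS/Katyusha X momentum schedule.
x_prev = x = w = np.zeros(d)
beta = 0.3                                     # assumed momentum weight
for t in range(300):
    y = x + beta * (x - x_prev)                # extrapolated query point
    x_prev, x = x, svrs_round(y, w)
    if (t + 1) % 10 == 0:
        w = x
```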