We develop two compression-based stochastic gradient algorithms to solve a class of non-smooth strongly convex-strongly concave saddle-point problems in a decentralized setting (without a central server). Our first algorithm is a Restart-based Decentralized Proximal Stochastic Gradient method with Compression (C-RDPSG) for general stochastic settings. We provide rigorous theoretical guarantees for C-RDPSG, with gradient computation complexity and communication complexity of order $\mathcal{O}( (1+\delta)^4 \frac{1}{L^2}{\kappa_f^2}\kappa_g^2 \frac{1}{\epsilon} )$ to achieve an $\epsilon$-accurate saddle-point solution, where $\delta$ denotes the compression factor, $\kappa_f$ and $\kappa_g$ denote the condition numbers of the objective function and the communication graph respectively, and $L$ denotes the smoothness parameter of the smooth part of the objective function. Next, we present a Decentralized Proximal Stochastic Variance Reduced Gradient algorithm with Compression (C-DPSVRG) for the finite-sum setting, which exhibits gradient computation complexity and communication complexity of order $\mathcal{O} \left((1+\delta) \max \{\kappa_f^2, \sqrt{\delta}\kappa^2_f\kappa_g,\kappa_g \} \log\left(\frac{1}{\epsilon}\right) \right)$. Extensive numerical experiments show competitive performance of the proposed algorithms and support the theoretical results obtained.
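For intuition about the compression factor $\delta$, the following is a minimal sketch (not taken from the paper) of one standard contractive compressor, top-$k$ sparsification, of the kind that compressed-communication methods typically apply to the messages exchanged between neighboring nodes; the paper's precise compression assumption, and how $\delta$ enters the stated bounds, is given in the paper itself.

```python
import numpy as np

def top_k_compress(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest.

    Top-k is a standard contractive compressor: for x in R^d it satisfies
    ||C(x) - x||^2 <= (1 - k/d) ||x||^2, so each transmitted message
    carries only k of d coordinates at the cost of extra iterations.
    """
    d = x.size
    if k >= d:
        return x.copy()
    out = np.zeros_like(x)
    # Indices of the k entries with largest absolute value.
    idx = np.argpartition(np.abs(x), d - k)[d - k:]
    out[idx] = x[idx]
    return out

# Example: compress a random 100-dimensional vector, keeping 10% of entries.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
cx = top_k_compress(x, k=10)
print(np.count_nonzero(cx))  # 10
```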