We study decentralized consensus and stochastic optimization problems with compressed communication over static directed graphs. We propose an iterative gradient-based algorithm that compresses messages according to a desired compression ratio. The proposed method provably reduces the communication overhead on the network at every communication round. In contrast to the existing literature, we allow arbitrary compression ratios in the communicated messages. We show a linear convergence rate for the proposed method on the consensus problem. Moreover, we provide explicit convergence rates for decentralized stochastic optimization problems with smooth objectives that are either (i) strongly convex, (ii) convex, or (iii) non-convex. Finally, we provide numerical experiments to illustrate convergence under arbitrary compression ratios and the communication efficiency of our algorithm.
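To make the notion of an arbitrary compression ratio concrete, the sketch below shows one standard way a per-round compression ratio can be realized: a random sparsifying compressor that transmits only a chosen fraction of a message's coordinates. This is an illustrative example under assumed names (`compress`, `ratio`), not the specific operator used by the proposed algorithm.

```python
# Illustrative sketch (not the paper's exact operator): a sparsifying
# compressor that keeps only a fraction of a message's entries, so the
# compression ratio can be set to any value in (0, 1].
import numpy as np


def compress(x: np.ndarray, ratio: float, rng: np.random.Generator) -> np.ndarray:
    """Randomly keep ceil(ratio * len(x)) coordinates of x, zeroing the rest.

    `ratio` is the desired compression ratio: the fraction of entries
    actually transmitted in each communication round.
    """
    k = max(1, int(np.ceil(ratio * x.size)))          # number of entries to send
    keep = rng.choice(x.size, size=k, replace=False)  # coordinates to transmit
    out = np.zeros_like(x)
    out[keep] = x[keep]                               # only these values leave the node
    return out


# Example: a node compresses its outgoing message before sending it
# to its out-neighbors in the directed graph.
rng = np.random.default_rng(0)
message = rng.standard_normal(10)
print(compress(message, ratio=0.3, rng=rng))  # roughly 30% of entries survive
```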