Information compression is essential for reducing communication cost in distributed optimization over peer-to-peer networks. This paper proposes a communication-efficient linearly convergent distributed (COLD) algorithm to solve strongly convex optimization problems. By compressing innovation vectors, which are the differences between decision vectors and their estimates, COLD achieves linear convergence for a class of $\delta$-contracted compressors. We explicitly quantify how compression affects the convergence rate and show that COLD attains the same rate as its uncompressed counterpart. To accommodate a wider class of compressors that includes the binary quantizer, we further design a novel dynamical scaling mechanism and obtain the linearly convergent Dyna-COLD algorithm. Importantly, our results strictly improve upon existing results for the quantized consensus problem. Numerical experiments demonstrate the advantages of both algorithms under various compressors.
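For reference, the contraction property invoked above is typically stated as follows; this is the standard form used in the compression literature, written in our own notation as a sketch (the paper's precise definition may differ in details, e.g., whether the compressor is randomized):
\[
\mathbb{E}\left[\|\mathcal{C}(x) - x\|^2\right] \le (1-\delta)\,\|x\|^2, \qquad \forall x \in \mathbb{R}^d, \quad \delta \in (0,1],
\]
where $\mathcal{C}$ denotes the compressor and the expectation is over its internal randomness. The binary (sign) quantizer fails this bound: for $x$ near the origin, $\|\operatorname{sign}(x) - x\|^2 \approx d$ while $(1-\delta)\|x\|^2 \approx 0$, which is precisely what motivates the dynamical scaling mechanism behind Dyna-COLD.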