The communication cost of distributed optimization algorithms is a major bottleneck to their scalability. This work considers a parameter-server setting in which each worker is constrained to communicate information to the server using only $R$ bits per dimension. We show that $\mathbf{democratic}$ $\mathbf{embeddings}$ from random matrix theory are highly effective for designing efficient and optimal vector quantizers that respect this bit budget. The resulting polynomial-complexity source coding schemes are used to design distributed optimization algorithms with convergence rates matching the minimax optimal lower bounds for (i) Smooth and Strongly-Convex objectives with access to an Exact Gradient oracle, as well as (ii) General Convex and Non-Smooth objectives with access to a Noisy Subgradient oracle. We further propose a relaxation of this coding scheme that is nearly minimax optimal. Numerical simulations validate our theoretical claims.
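To make the quantization idea concrete, here is a minimal sketch of democratic-embedding-style quantization. It uses a randomized Hadamard rotation as the democratizing map so that all coordinates have comparable magnitude before uniform scalar quantization at $R$ bits per dimension; the paper's actual construction and coding scheme may differ, and all names here (`hadamard`, `d`, `R`) are illustrative assumptions.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction of an orthonormal n x n Hadamard matrix
    # (n must be a power of 2).
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

rng = np.random.default_rng(0)
d = 256   # dimension (power of 2 for the Hadamard transform)
R = 4     # bit budget per dimension

# A heavy-tailed vector: hard to quantize well with naive per-coordinate schemes.
x = rng.standard_normal(d) * rng.exponential(1.0, d)

# Randomized Hadamard rotation spreads the energy of x evenly across
# coordinates, yielding a "democratic" representation.
D = np.diag(rng.choice([-1.0, 1.0], d))
H = hadamard(d)
y = H @ D @ x

# Uniform scalar quantization of the flattened coefficients at R bits each.
# (The scale m is a single scalar that would also be communicated.)
levels = 2 ** R
m = np.abs(y).max()
step = 2 * m / levels
q = np.clip(np.round(y / step), -(levels // 2), levels // 2 - 1) * step

# Decode at the server: invert the orthogonal rotation.
x_hat = D @ H.T @ q

rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

Because the rotation is orthogonal, the per-coordinate quantization error in the rotated domain translates directly into reconstruction error for `x`, and flattening the dynamic range keeps the uniform quantizer's step size small relative to the vector's norm.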