We study the problem of distributed mean estimation and optimization under communication constraints. We propose a correlated quantization protocol whose leading error term depends on the mean deviation of the data points rather than only their absolute range. The design requires no prior knowledge of the concentration properties of the dataset, which previous works needed in order to obtain such a dependence. We show that applying the proposed protocol as a subroutine in distributed optimization algorithms leads to better convergence rates. We also prove the optimality of our protocol under mild assumptions. Experimental results show that our proposed algorithm outperforms existing mean estimation protocols on a diverse set of tasks.
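To make the intuition concrete, below is a minimal, hypothetical one-dimensional sketch (my own toy illustration, not the exact protocol analyzed in the paper): each client holds a value in [0, 1] and sends a single bit. Replacing each client's independent dither with a shared, randomly permuted stratified grid keeps every bit unbiased, but makes the rounding errors negatively correlated across clients, so the error of the estimated mean tracks how tightly the values cluster rather than their full range.

```python
import numpy as np

def independent_round(x, rng):
    # Baseline 1-bit stochastic rounding: client i sends 1 with probability x_i,
    # using its own independent uniform dither.
    u = rng.random(len(x))
    return (u <= x).astype(float)

def correlated_round(x, rng):
    # Toy "correlated" rounding: clients share one uniform offset and a random
    # permutation, so their dithers form the stratified grid (perm(i) + u) / n
    # instead of n i.i.d. uniforms. Each dither is still marginally Uniform[0,1),
    # so each bit remains unbiased, but errors tend to cancel across clients.
    n = len(x)
    u = rng.random()
    perm = rng.permutation(n)
    thresholds = (perm + u) / n
    return (thresholds <= x).astype(float)

rng = np.random.default_rng(0)
n = 1000
# Clustered inputs: small mean deviation, so correlated dithering should help.
x = np.clip(0.5 + 0.01 * rng.standard_normal(n), 0.0, 1.0)

true_mean = x.mean()
err_ind = abs(independent_round(x, rng).mean() - true_mean)
err_cor = abs(correlated_round(x, rng).mean() - true_mean)
print(f"true mean                 : {true_mean:.4f}")
print(f"independent rounding error: {err_ind:.4f}")  # roughly O(1/sqrt(n))
print(f"correlated rounding error : {err_cor:.4f}")  # small when values are clustered
```

In this sketch the grid of thresholds (assumed here purely for illustration) plays the role of the shared randomness: when all clients hold nearly the same value, at most a handful of bits can disagree with the ideal count, so the error of the averaged bits scales with the spread of the values rather than with the worst-case range [0, 1].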