We focus on the commonly used synchronous Gradient Descent paradigm for large-scale distributed learning, for which there has been growing interest in developing efficient and robust gradient aggregation strategies that overcome two key system bottlenecks: communication bandwidth and straggler delays. In particular, the Ring-AllReduce (RAR) design avoids a bandwidth bottleneck at any particular node by arranging the workers in a logical ring and allowing each worker to communicate only with its neighbors. On the other hand, Gradient Coding (GC) has recently been proposed to mitigate stragglers in a master-worker topology through a carefully designed redundant allocation of the data set to the workers. We propose a joint communication-topology design and data-set allocation strategy, named CodedReduce (CR), that combines the best of both RAR and GC. That is, it parallelizes communication over a tree topology, leading to efficient bandwidth utilization, and it carefully designs a redundant data-set allocation and coding strategy at the nodes to make the proposed gradient aggregation scheme robust to stragglers. In particular, we quantify the communication parallelization gain and straggler resiliency of the proposed CR scheme, and prove its optimality when the communication topology is a regular tree. Moreover, we characterize the expected run-time of CR and show order-wise speedups compared to the benchmark schemes. Finally, we empirically evaluate the performance of our proposed CR design on Amazon EC2 and demonstrate that it achieves speedups of up to 27.2x and 7.0x over the GC and RAR benchmarks, respectively.
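To make the redundant-allocation idea behind GC (and hence the coding layer of CR) concrete, the following is a minimal sketch of classic gradient coding for n = 3 workers tolerating s = 1 straggler: the data set is split into 3 partitions, each worker redundantly stores s + 1 = 2 of them, and sends a single fixed linear combination of its partial gradients, so the master can recover the full gradient sum from any n - s = 2 workers. The encoding/decoding coefficients below are one standard construction and are illustrative, not the specific code used in the paper.

```python
# Gradient coding sketch: n = 3 workers, s = 1 straggler tolerated.
# Partition gradients g1, g2, g3; each worker holds 2 partitions and
# sends one coded message. Any 2 workers suffice to decode g1+g2+g3.
import numpy as np

rng = np.random.default_rng(0)
g = [rng.standard_normal(4) for _ in range(3)]  # one partial gradient per partition

# Encoding: fixed linear combinations of each worker's locally held gradients.
f = [
    0.5 * g[0] + g[1],   # worker 1 holds partitions {1, 2}
    g[1] - g[2],         # worker 2 holds partitions {2, 3}
    0.5 * g[0] + g[2],   # worker 3 holds partitions {1, 3}
]

# Decoding: for each pair of non-straggling workers, fixed coefficients
# (a, b) satisfy a*f_i + b*f_j = g1 + g2 + g3.
decode = {
    (0, 1): (2.0, -1.0),
    (0, 2): (1.0, 1.0),
    (1, 2): (1.0, 2.0),
}

full_gradient = g[0] + g[1] + g[2]
for (i, j), (a, b) in decode.items():
    # Whichever single worker straggles, the other two recover the sum.
    assert np.allclose(a * f[i] + b * f[j], full_gradient)
```

CR builds on this kind of coded redundancy but spreads the aggregation over a tree, so no single parent node becomes a bandwidth bottleneck while each subtree remains straggler-resilient.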

