Distributed implementations are crucial for speeding up large-scale machine learning applications. Distributed gradient descent (GD) is widely employed to parallelize the learning task by distributing the dataset across multiple workers. A significant performance bottleneck for the per-iteration completion time in distributed synchronous GD is \emph{straggling} workers. Coded distributed computation techniques have recently been introduced to mitigate stragglers and to speed up GD iterations by assigning redundant computations to workers. In this paper, we consider gradient coding (GC) and propose a novel dynamic GC scheme, which assigns redundant data to workers to acquire the flexibility to choose, at each iteration, from among a set of possible codes depending on the past straggling behavior. In particular, we consider GC with clustering and regulate the number of stragglers in each cluster by dynamically forming the clusters at each iteration; hence, the proposed scheme is called \emph{GC with dynamic clustering} (GC-DC). Under a time-correlated straggling behavior, GC-DC gains from adapting to the straggling behavior over time: at each iteration, it aims to distribute the stragglers across the clusters as uniformly as possible based on the past straggler behavior. For both homogeneous and heterogeneous worker models, we numerically show that GC-DC provides significant improvements in the average per-iteration completion time without increasing the communication load relative to the original GC scheme.
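To make the dynamic clustering idea concrete, the following is a minimal sketch of how clusters could be re-formed at each iteration from past straggling observations. All names here (`dynamic_clustering`, `straggle_scores`) are hypothetical, and the greedy round-robin assignment merely stands in for the paper's actual cluster-formation algorithm; in particular, it ignores the storage constraints that, in GC-DC, restrict which clusters a worker may join.

```python
import random
from typing import List

def dynamic_clustering(straggle_scores: List[float],
                       num_clusters: int) -> List[List[int]]:
    """Greedy sketch of a clustering step in the spirit of GC-DC.

    Workers are sorted by an estimated straggling score (e.g., an empirical
    straggling frequency over past iterations) and dealt round-robin into
    clusters, so that the workers most likely to straggle are spread across
    the clusters as uniformly as possible.
    """
    # Worker indices ordered from most to least likely to straggle.
    order = sorted(range(len(straggle_scores)),
                   key=lambda w: straggle_scores[w], reverse=True)
    clusters: List[List[int]] = [[] for _ in range(num_clusters)]
    # Round-robin: the k most likely stragglers land in k distinct
    # clusters, then assignment continues with the remaining workers.
    for i, worker in enumerate(order):
        clusters[i % num_clusters].append(worker)
    return clusters

if __name__ == "__main__":
    # Example: 12 workers, 4 clusters of 3; scores stand in for
    # straggling statistics accumulated over previous GD iterations.
    random.seed(0)
    scores = [random.random() for _ in range(12)]
    for c, members in enumerate(dynamic_clustering(scores, 4)):
        print(f"cluster {c}: workers {members}")
```

Re-running this assignment at every iteration, with scores updated from the latest straggling observations, is what gives the scheme its adaptivity; the redundant data placement is what makes such re-assignment feasible without moving data between workers.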