Distributed deep learning (DDL) training systems are designed for cloud and data-center environments that assume homogeneous compute resources, high network bandwidth, sufficient memory and storage, and independent and identically distributed (IID) data across all nodes. However, these assumptions do not necessarily hold at the edge, especially when training neural networks on streaming data in an online manner. Computing at the edge suffers from both systems and statistical heterogeneity. Systems heterogeneity arises from differences in the compute resources and bandwidth specific to each device, while statistical heterogeneity comes from unbalanced and skewed data across edge devices. Differing streaming rates among devices are another source of heterogeneity when dealing with streaming data. If the streaming rate is lower than the training batch size, a device must wait until enough samples have streamed in before it can perform even a single iteration of stochastic gradient descent (SGD). Low-volume streams thus act like stragglers, slowing down devices with high-volume streams in synchronous training. Conversely, if the streaming rate is too high and a device cannot train at line rate, data quickly accumulates in its buffer. In this paper, we introduce ScaDLES to efficiently train on streaming data at the edge in an online fashion, while also addressing the challenges of limited bandwidth and training with non-IID data. We empirically show that ScaDLES converges up to 3.29 times faster than conventional distributed SGD.
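To make the straggler effect concrete, the following is a minimal simulation sketch (not code from ScaDLES; the batch size, device names, and streaming rates are all illustrative assumptions). It shows how, in synchronous training, a device with a slow stream must accumulate samples over several rounds before it can take a single SGD step, stalling every other device, while a device with a fast stream builds up an ever-growing buffer.

```python
import itertools

# Illustrative values (assumptions, not from the paper):
BATCH_SIZE = 32
STREAM_RATES = {"device_a": 64, "device_b": 8}  # samples arriving per round

buffers = {d: 0 for d in STREAM_RATES}

for step in range(5):
    rounds_waited = 0
    # A synchronous step can proceed only when EVERY device holds a full
    # batch, so the slowest stream dictates the pace of training.
    while any(b < BATCH_SIZE for b in buffers.values()):
        for d, rate in STREAM_RATES.items():
            buffers[d] += rate
        rounds_waited += 1
    for d in buffers:
        buffers[d] -= BATCH_SIZE  # consume one batch for this SGD step
    print(f"step {step}: waited {rounds_waited} round(s); buffers = {buffers}")
```

Running the sketch, device_b (8 samples/round) forces four rounds of waiting per step, while device_a's leftover samples pile up step after step, illustrating both failure modes the abstract describes: low-volume streams acting as stragglers, and high-volume streams overflowing their buffers.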