Distributed machine learning (ML) can bring more computational resources to bear than single-machine learning, thus enabling reductions in training time. Distributed learning partitions models and data over many machines, allowing model and dataset sizes beyond the compute power and memory of a single machine. In practice, though, distributed ML is challenging when distribution is mandatory rather than chosen by the practitioner. In such scenarios, data may be unavoidably separated among workers due to limited memory capacity per worker or because of data privacy concerns. In these settings, existing distributed methods either fail outright, because transfer costs across workers dominate, or do not apply at all. We propose a new approach to distributed learning of fully connected neural networks, called independent subnet training (IST), to handle these cases. In IST, the original network is decomposed into a set of narrow subnetworks of the same depth. These subnetworks are trained locally before parameters are exchanged to produce new subnets, and the training cycle repeats. Such a naturally "model parallel" approach limits memory usage, since each device stores only a portion of the network parameters. Additionally, no data sharing between workers is required (i.e., subnet training is local and independent), and decomposing the original network into independent subnets reduces both communication volume and frequency. These properties allow IST to cope with distributed data, slow interconnects, and limited device memory, making it a suitable approach for cases of mandatory distribution. We show experimentally that IST achieves training times that are much lower than those of common distributed learning approaches.
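To make the decompose-train-exchange cycle concrete, the following is a minimal single-process sketch of the idea, assuming a one-hidden-layer MLP with a squared-error loss. The layer sizes, the `local_sgd` helper, the simulated worker loop, and all hyperparameters are illustrative assumptions, not the authors' implementation; details such as activation rescaling and the actual distributed communication are omitted.

```python
# Sketch of independent subnet training (IST): randomly partition the hidden
# neurons among workers, train each narrow subnet locally on that worker's
# data, merge the updated blocks back, and repeat with a fresh partition.
import numpy as np

rng = np.random.default_rng(0)

d_in, d_hidden, d_out, n_workers = 20, 64, 1, 4   # illustrative sizes
W1 = rng.normal(scale=0.1, size=(d_in, d_hidden))
W2 = rng.normal(scale=0.1, size=(d_hidden, d_out))

def local_sgd(W1_s, W2_s, X, y, steps=10, lr=0.01):
    """Train one subnet locally with plain SGD (hypothetical helper)."""
    for _ in range(steps):
        h = np.maximum(X @ W1_s, 0.0)        # ReLU hidden activations
        pred = h @ W2_s
        err = pred - y                       # gradient of squared error
        g2 = h.T @ err / len(X)
        gh = (err @ W2_s.T) * (h > 0)
        g1 = X.T @ gh / len(X)
        W1_s -= lr * g1
        W2_s -= lr * g2
    return W1_s, W2_s

# Each worker holds its own data shard; shards are never exchanged.
shards = [(rng.normal(size=(128, d_in)), rng.normal(size=(128, d_out)))
          for _ in range(n_workers)]

for _ in range(50):
    # Randomly partition the hidden neurons, one group per worker.
    groups = np.array_split(rng.permutation(d_hidden), n_workers)
    for s, idx in enumerate(groups):
        X, y = shards[s]
        # The subnet consists only of the weights touching this worker's
        # hidden neurons, so each device stores a fraction of the model.
        W1_s, W2_s = local_sgd(W1[:, idx].copy(), W2[idx, :].copy(), X, y)
        # Only the updated subnet blocks are exchanged and merged back.
        W1[:, idx] = W1_s
        W2[idx, :] = W2_s
```

In this sketch, communication per round is limited to the subnet blocks (a 1/`n_workers` slice of each weight matrix), which is the property the abstract highlights for slow interconnects and memory-limited devices.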