Petabytes of data are generated each day by the emerging Internet of Things (IoT), yet only a small fraction can ultimately be collected and used for Machine Learning (ML) due to concerns about data and privacy leakage, which seriously hinders ML's growth. To alleviate this problem, federated learning was proposed to train a model on multiple clients' combined data without sharing datasets within the cluster. Nevertheless, federated learning introduces massive communication overhead, as the data synchronized in each epoch is the same size as the model, leading to low communication efficiency. Consequently, various methods, mainly focusing on reducing communication rounds and on data compression, have been proposed to lower the communication overhead of federated learning. In this paper, we propose Overlap-FedAvg, a framework that parallelizes the model training phase with the model uploading and downloading phase, so that the latter can be completely covered by the former. Compared to vanilla FedAvg, Overlap-FedAvg further incorporates a hierarchical computing strategy, a data compensation mechanism, and a Nesterov accelerated gradients (NAG) algorithm. Besides, Overlap-FedAvg is orthogonal to many other compression methods, so they can be applied together to maximize the utilization of the cluster. Furthermore, a theoretical analysis is provided to prove the convergence of the proposed Overlap-FedAvg framework. Extensive experiments on both conventional and recurrent tasks with multiple models and datasets also demonstrate that the proposed Overlap-FedAvg framework substantially boosts the federated learning process.
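The core idea of overlapping communication with computation can be illustrated with a minimal sketch. The snippet below is a hypothetical toy simulation, not the paper's implementation: it uses a background thread to transmit the previous round's update while the current round's local training proceeds, so the communication cost is hidden under the computation. The model, training step, and `communicate` function are placeholder stand-ins.

```python
import threading
import time

def local_training(model, steps=3):
    # Placeholder for local SGD: repeatedly shrink each weight by 10%.
    for _ in range(steps):
        model = [w - 0.1 * w for w in model]
    return model

def communicate(update, log):
    # Placeholder for uploading an update to the server; the sleep
    # stands in for network latency.
    time.sleep(0.01)
    log.append(update)

def overlap_round(model, prev_update, log):
    # Start transmitting the PREVIOUS round's update in the background,
    # so it overlaps with this round's local computation.
    t = threading.Thread(target=communicate, args=(prev_update, log))
    t.start()
    new_model = local_training(model)
    t.join()  # communication finishes under the cover of computation
    return new_model

server_log = []
model = [1.0, 2.0]
model = overlap_round(model, prev_update=[0.0, 0.0], log=server_log)
```

Note that because the transmitted update is one round stale, a real system needs the data compensation mechanism mentioned above to correct for the delay; this sketch omits it.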