Federated learning (FL) has attracted tremendous attention in recent years due to its privacy-preserving measures and great potential in distributed yet privacy-sensitive applications such as finance and healthcare. However, the high communication overhead of transmitting high-dimensional networks and extra security masks remains a bottleneck of FL. This paper proposes a communication-efficient FL framework with Adaptive Quantized Gradient (AQG), which adaptively adjusts the quantization level based on each local gradient's update to fully exploit the heterogeneity of local data distributions and reduce unnecessary transmissions. In addition, client dropout issues are taken into account, and an Augmented AQG is developed that limits the dropout noise with an appropriate amplification mechanism for the transmitted gradients. Theoretical analysis and experimental results show that the proposed AQG achieves 25%-50% additional transmission reduction compared with existing popular methods, including Quantized Gradient Descent (QGD) and Lazily Aggregated Quantized (LAQ) gradient-based methods, without deteriorating convergence properties. In particular, experiments with heterogeneous data distributions corroborate a more significant transmission reduction than with independent and identically distributed (IID) data. Meanwhile, the proposed AQG is empirically robust to client dropout rates of up to 90%, and the Augmented AQG further improves the FL system's communication efficiency in the presence of the moderate-scale client dropouts commonly seen in practical FL scenarios.
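As a minimal sketch of the adaptive-quantization idea described above, the following Python snippet quantizes a client's gradient innovation (the difference between the fresh local gradient and the last transmitted value, as in LAQ-style schemes) with a bit width that grows with the innovation's relative magnitude. The bit-selection rule `adaptive bits` mapping and the parameters `b_min`, `b_max` are hypothetical illustrations under these assumptions, not the paper's exact AQG criterion.

```python
import numpy as np

def quantize(diff, bits):
    """Uniformly quantize a gradient innovation onto 2**bits levels
    spanning [-R, R], where R = max|diff| (a standard uniform quantizer)."""
    R = np.max(np.abs(diff))
    if R == 0.0:
        return diff
    levels = 2 ** bits - 1
    step = 2.0 * R / levels
    # Round each coordinate to the nearest quantization level.
    return np.round((diff + R) / step) * step - R

def aqg_step(local_grad, last_sent, b_min=2, b_max=8):
    """One client-side step: choose more bits when the local update is
    large relative to the last transmitted gradient, fewer when small.
    NOTE: this mapping is an illustrative assumption, not the paper's rule."""
    diff = local_grad - last_sent
    ratio = np.linalg.norm(diff) / (np.linalg.norm(last_sent) + 1e-12)
    # Larger relative innovation -> finer quantization (more bits).
    bits = int(np.clip(b_min + np.log2(1.0 + ratio) * (b_max - b_min),
                       b_min, b_max))
    q_diff = quantize(diff, bits)
    # The client transmits q_diff at `bits` bits per coordinate;
    # the server reconstructs last_sent + q_diff.
    return last_sent + q_diff, bits
```

Under this sketch, clients whose local data induce small gradient changes spend fewer bits per round, which is consistent with the abstract's claim that heterogeneous (non-IID) data distributions yield larger transmission savings.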