Federated Learning (FL) is a recent development in machine learning in which models are trained collaboratively without the training data ever leaving client devices, thereby preserving data privacy. In realistic FL settings, the training set is distributed over clients in a highly non-Independent and Identically Distributed (non-IID) fashion, which has been shown extensively to harm FL convergence speed and final model performance. To address this challenge, we propose a novel, generalised approach for incorporating adaptive optimisation into FL with the Federated Global Biased Optimiser (FedGBO) algorithm. FedGBO accelerates FL by employing a set of global biased optimiser values during the client-training phase, which helps to reduce 'client-drift' from non-IID data whilst also benefiting from adaptive optimisation. We show that the FedGBO update with a generic optimiser can be reformulated as centralised training using biased gradients and optimiser updates, and apply this theoretical framework to prove the convergence of FedGBO using momentum-Stochastic Gradient Descent (SGDm). We also conduct extensive experiments using 4 realistic FL benchmark datasets (CIFAR100, Sent140, FEMNIST, Shakespeare) and 3 popular adaptive optimisers (RMSProp, SGDm, Adam) to compare the performance of state-of-the-art adaptive-FL algorithms. The results demonstrate that FedGBO achieves highly competitive performance with lower communication and computation costs, and provide practical insights into the trade-offs associated with the different adaptive-FL algorithms and optimisers for real-world FL deployments.
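The core idea described above, clients performing local steps with a fixed, globally-shared (and therefore biased) set of optimiser statistics, can be illustrated with a minimal toy sketch for the SGDm case. This is not the paper's exact algorithm: the function names, the particular server-side refresh of the global momentum buffer, and all hyperparameter values are assumptions made purely for illustration.

```python
import numpy as np

def client_update(x_global, m_global, grad_fn, lr=0.1, beta=0.9, local_steps=5):
    """One client's local training in a FedGBO-style scheme (illustrative only).

    The client applies momentum-SGD steps but keeps the *global* momentum
    buffer m_global fixed throughout local training, rather than updating
    it locally -- this is the 'global biased optimiser values' idea.
    """
    x = x_global.copy()
    for _ in range(local_steps):
        g = grad_fn(x)
        # SGDm step using the fixed (biased) global momentum buffer
        x = x - lr * (beta * m_global + g)
    return x

def server_round(x_global, m_global, client_grad_fns, lr=0.1, beta=0.9):
    """Aggregate client models, then refresh the global momentum buffer.

    The momentum refresh below (from the average per-round model change)
    is one plausible choice for this sketch, not the paper's rule.
    """
    new_models = [client_update(x_global, m_global, g_fn, lr, beta)
                  for g_fn in client_grad_fns]
    x_new = np.mean(new_models, axis=0)
    # estimate an 'average gradient' from the aggregate model change
    avg_delta = (x_global - x_new) / lr
    m_new = beta * m_global + (1 - beta) * avg_delta
    return x_new, m_new
```

On a toy non-IID problem (each client holding a quadratic with a different minimiser), repeated rounds of `server_round` drive the global model towards the average of the client optima, since every client starts each local phase from the same model and momentum state.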