Federated learning (FL) is a distributed approach to training a global model over a set of local clients while keeping data localized. It reduces privacy and security risks but faces important challenges, including high communication costs and client drift. To address these issues, we propose FedElasticNet, a communication-efficient and drift-robust FL framework that leverages the elastic net. It repurposes the two elastic net regularizers (i.e., $\ell_1$ and $\ell_2$ penalties on the local model updates): (1) the $\ell_1$-norm regularizer sparsifies the local updates to reduce communication costs, and (2) the $\ell_2$-norm regularizer mitigates client drift by limiting the impact of drifting local updates caused by data heterogeneity. FedElasticNet is a general framework for FL; hence, without additional costs, it can be integrated into prior FL techniques such as FedAvg, FedProx, SCAFFOLD, and FedDyn. We show that our framework effectively resolves the communication cost and client drift problems simultaneously.
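To make the role of the two penalties concrete, the following is a minimal sketch of a regularized local objective of the kind described above, under the assumption that client $i$ penalizes the deviation of its local model $w$ from the current global model $w^t$; the coefficients $\lambda_1$ and $\lambda_2$ are illustrative notation, not necessarily the paper's own:
$$
\min_{w} \; F_i(w) \;+\; \lambda_1 \,\lVert w - w^t \rVert_1 \;+\; \frac{\lambda_2}{2}\,\lVert w - w^t \rVert_2^2,
$$
where $F_i$ is the local loss of client $i$. The $\ell_1$ term drives most coordinates of the local update $w - w^t$ to exactly zero, so only the nonzero entries need to be communicated to the server, while the $\ell_2$ term keeps the local solution close to $w^t$, bounding the drift induced by heterogeneous client data.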