Federated learning (FL) collaboratively trains a shared global model across multiple local clients while keeping the training data decentralized to preserve data privacy. However, standard FL methods ignore the noisy client issue, which may harm the overall performance of the aggregated model. In this paper, we first analyze the noisy client problem, and then model noisy clients with different noise distributions (e.g., Bernoulli and truncated Gaussian distributions). To learn with noisy clients, we propose a simple yet effective FL framework, named Federated Noisy Client Learning (Fed-NCL), a plug-and-play algorithm with two main components: a data quality measurement (DQM) that dynamically quantifies the data quality of each participating client, and a noise robust aggregation (NRA) that adaptively aggregates the local model of each client by jointly considering the amount of local training data and the data quality of that client. Fed-NCL can be easily applied to any standard FL workflow to handle the noisy client issue. Experimental results on various datasets demonstrate that our algorithm boosts the performance of different state-of-the-art systems in the presence of noisy clients.
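To make the noise model concrete, below is a minimal Python sketch of one plausible reading of the abstract's setup: each client's noise rate is drawn from a truncated Gaussian, and each label is then corrupted with that probability via a per-sample Bernoulli flip. The function names and the parameter values (mean 0.3, std 0.2) are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np
from scipy.stats import truncnorm

def sample_noise_rates(num_clients, mean=0.3, std=0.2, low=0.0, high=1.0, seed=0):
    """Draw a per-client noise rate from a Gaussian truncated to [low, high]."""
    a, b = (low - mean) / std, (high - mean) / std
    return truncnorm.rvs(a, b, loc=mean, scale=std, size=num_clients,
                         random_state=seed)

def corrupt_labels(labels, noise_rate, num_classes, seed=0):
    """Flip each label with probability noise_rate (Bernoulli mask) to a
    uniformly random class (which may coincide with the original)."""
    rng = np.random.default_rng(seed)
    labels = labels.copy()
    flip = rng.random(len(labels)) < noise_rate
    labels[flip] = rng.integers(0, num_classes, size=flip.sum())
    return labels
```

Under this simulation, clients differ in how corrupted their local training sets are, which is exactly the heterogeneity the DQM and NRA components are designed to handle.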
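The aggregation step can likewise be sketched. The abstract states that NRA weights each client's local model by jointly considering its data quantity and its estimated data quality; the product weighting below is an illustrative assumption (the paper's exact combination rule may differ), and `quality_scores` stands in for whatever the DQM module outputs.

```python
import numpy as np

def noise_robust_aggregate(client_params, sample_counts, quality_scores):
    """Weighted average of flattened client models.

    client_params: list of 1-D numpy arrays (local model parameters)
    sample_counts: number of local training samples per client (n_k)
    quality_scores: estimated per-client data quality (q_k), higher = cleaner
    """
    weights = (np.asarray(sample_counts, dtype=float)
               * np.asarray(quality_scores, dtype=float))
    weights /= weights.sum()  # normalize so the weights sum to 1
    stacked = np.stack(client_params)  # shape: (num_clients, num_params)
    return (weights[:, None] * stacked).sum(axis=0)
```

Setting all quality scores to 1 recovers standard FedAvg weighting by sample count, which is why a scheme like this can act as a plug-and-play replacement for the aggregation step of an existing FL pipeline.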