Federated learning (FL) is an emerging technique for collaboratively training a global machine learning model while keeping data localized on user devices. The main obstacle to FL's practical deployment is the non-independent and identically distributed (Non-IID) data across users, which slows convergence and degrades model performance. To tackle this fundamental issue, we propose ComFed, a method that enhances the whole training process on both the client and server sides. The key idea of ComFed is to simultaneously employ client-variance-reduction techniques to facilitate server aggregation and global adaptive-update techniques to accelerate learning. Our experiments on the CIFAR-10 classification task show that ComFed improves upon state-of-the-art algorithms dedicated to Non-IID data.
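ComFed's exact algorithm is not specified in this abstract. As a hedged sketch of the general recipe it describes, the toy example below combines SCAFFOLD-style control variates (a standard client-variance-reduction technique) with a FedAdam-style adaptive server step on a synthetic Non-IID quadratic problem. All function names and hyperparameters here are illustrative assumptions, not ComFed's actual design.

```python
import numpy as np

# Toy Non-IID setup: each client k minimizes ||w - t_k||^2 with its own target
# t_k, so local gradients pull the global model in different directions.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([-1.0, 0.5])]

def client_grad(w, t):
    return 2.0 * (w - t)

def local_update(w_global, t, c_global, c_local, lr=0.05, steps=10):
    """Client steps with SCAFFOLD-style control variates (variance reduction)."""
    w = w_global.copy()
    for _ in range(steps):
        g = client_grad(w, t)
        w = w - lr * (g - c_local + c_global)  # drift-corrected gradient step
    # Control-variate refresh: c_i+ = c_i - c + (x - y_i) / (K * lr)
    c_local_new = c_local - c_global + (w_global - w) / (lr * steps)
    return w, c_local_new

def server_update(w_global, deltas, m, v, lr=0.1, b1=0.9, b2=0.99, eps=1e-3):
    """FedAdam-style adaptive server step on the averaged client delta."""
    d = np.mean(deltas, axis=0)
    m = b1 * m + (1 - b1) * d
    v = b2 * v + (1 - b2) * d * d
    return w_global + lr * m / (np.sqrt(v) + eps), m, v

w = np.zeros(2)                        # global model
c_global = np.zeros(2)                 # server control variate
c_locals = [np.zeros(2) for _ in targets]
m, v = np.zeros(2), np.zeros(2)        # server Adam moments

for _ in range(100):                   # communication rounds
    deltas, new_cs = [], []
    for k, t in enumerate(targets):
        w_k, c_k = local_update(w, t, c_global, c_locals[k])
        deltas.append(w_k - w)
        new_cs.append(c_k)
    c_locals = new_cs
    c_global = np.mean(c_locals, axis=0)
    w, m, v = server_update(w, deltas, m, v)

w_star = np.mean(targets, axis=0)      # minimizer of the average objective
print(w, w_star)
```

The two hooks mirror the abstract's two ideas: the control variates cancel client drift so the aggregated deltas are less noisy, and the Adam-style server rule adapts the global step size to accelerate learning.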