Federated Learning (FL) is a method of training machine learning models on private data distributed over a large number of possibly heterogeneous clients such as mobile phones and IoT devices. In this work, we propose a new federated learning framework named HeteroFL to address heterogeneous clients equipped with very different computation and communication capabilities. Our solution enables the training of heterogeneous local models with varying computation complexities while still producing a single global inference model. For the first time, our method challenges the underlying assumption of existing work that local models must share the same architecture as the global model. We demonstrate several strategies to enhance FL training and conduct extensive empirical evaluations, spanning five computation complexity levels of three model architectures on three datasets. We show that adaptively distributing subnetworks according to clients' capabilities is both computation and communication efficient.
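To make the subnetwork idea concrete, the following is a minimal sketch, not the authors' released implementation, of how width-scaled local models might be sliced out of a single global model and averaged back in. The function names (`extract_submodel`, `aggregate`), the top-left slicing convention, and the per-client `rates` values are illustrative assumptions for dense weight matrices; a full system would also handle convolutional and normalization layers.

```python
# Hedged sketch of HeteroFL-style parameter slicing and aggregation.
# Each client receives the top-left p-fraction of every global weight
# matrix; the server averages updates element-wise over the regions each
# client actually covers. All names and shapes here are assumptions.
import numpy as np

def extract_submodel(global_w: np.ndarray, rate: float) -> np.ndarray:
    """Slice the top-left (rate*rows, rate*cols) block of a weight matrix."""
    rows = max(1, int(np.ceil(global_w.shape[0] * rate)))
    cols = max(1, int(np.ceil(global_w.shape[1] * rate)))
    return global_w[:rows, :cols].copy()

def aggregate(global_w: np.ndarray, client_ws: list) -> np.ndarray:
    """Average client sub-matrices into the global matrix, element-wise."""
    acc = np.zeros_like(global_w)
    cnt = np.zeros_like(global_w)
    for w in client_ws:
        r, c = w.shape
        acc[:r, :c] += w
        cnt[:r, :c] += 1.0
    covered = cnt > 0
    new_w = global_w.copy()
    new_w[covered] = acc[covered] / cnt[covered]  # entries no client touched keep old values
    return new_w

# Toy round: one full-capacity client and one half-capacity client.
rng = np.random.default_rng(0)
global_w = rng.standard_normal((8, 8))
rates = [1.0, 0.5]  # assumed per-client capability levels
local_ws = [extract_submodel(global_w, p) for p in rates]
local_ws = [w + 0.01 * rng.standard_normal(w.shape) for w in local_ws]  # stand-in for local training
global_w = aggregate(global_w, local_ws)
```

Under this scheme, weaker clients train and transmit only a small corner of the parameter matrix, which is how both computation and communication costs scale down with client capability.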