Federated Learning (FL) has received significant attention from industry and the research community due to its capability of keeping data on local devices. To aggregate the gradients of local models when training the global model, existing works require the global model and the local models to be identical. However, Internet of Things (IoT) devices are inherently heterogeneous in computation speed and onboard memory. In this paper, we propose an FL framework targeting the heterogeneity of IoT devices. Specifically, local models are compressed from the global model, and the gradients of the compressed local models are used to update the global model. We conduct preliminary experiments to illustrate that our framework can facilitate the design of IoT-aware FL.
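The aggregation scheme described above can be sketched as follows. This is a minimal toy illustration, not the paper's actual method: the compression step is assumed here to be random weight masking sized to each device's capacity (the paper does not specify the compression scheme), the local task is a linear model with a squared-error loss, and each masked coordinate of the global update is averaged over only the devices that retained it.

```python
import numpy as np

def compress(global_w, keep_ratio, rng):
    # Hypothetical compression: keep a random subset of weights sized to
    # the device's capacity (magnitude pruning would be another choice).
    mask = rng.random(global_w.shape) < keep_ratio
    return mask, global_w * mask

def local_gradient(local_w, mask, x, y):
    # Toy local step: gradient of mean squared error for a linear model,
    # restricted to the weights this device actually holds.
    grad = x.T @ (x @ local_w - y) / len(y)
    return grad * mask

def federated_round(global_w, devices, lr=0.1):
    # One round: each device compresses the global model to fit its
    # capacity, computes a gradient on the compressed model, and the
    # server averages each coordinate over the devices that kept it.
    grad_sum = np.zeros_like(global_w)
    count = np.zeros_like(global_w)
    for keep_ratio, (x, y), rng in devices:
        mask, local_w = compress(global_w, keep_ratio, rng)
        grad_sum += local_gradient(local_w, mask, x, y)
        count += mask
    avg = np.divide(grad_sum, count,
                    out=np.zeros_like(grad_sum), where=count > 0)
    return global_w - lr * avg
```

Under this sketch, devices with less memory simply receive a more aggressively compressed model (a smaller `keep_ratio`), while the server still accumulates every device's contribution into the single global model.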