With the booming deployment of the Internet of Things, health monitoring applications have gradually prospered. In the context of the recent COVID-19 pandemic, interest in permanent remote health monitoring solutions has risen, aiming to reduce contact and preserve limited medical resources. Among the technological approaches to realizing efficient remote health monitoring, federated learning (FL) has drawn particular attention due to its robustness in preserving data privacy. However, FL can incur high communication costs due to frequent transmissions between the FL server and clients. To tackle this problem, we propose in this paper a communication-efficient federated learning (CEFL) framework that involves client clustering and transfer learning. First, we group clients by computing similarity factors based on the characteristics of their neural networks. Then, a representative client in each cluster is selected as the cluster leader. Unlike conventional FL, our method performs FL training only among the cluster leaders. Subsequently, each leader uses transfer learning to update its cluster members with the trained FL model. Finally, each member fine-tunes the received model with its own data. To further reduce communication costs, we adopt a partial-layer FL aggregation approach, in which only part of the neural network model is updated rather than the full model. Through experiments, we show that CEFL can save up to 98.45% in communication costs while conceding less than 3% in accuracy, compared to conventional FL. Moreover, CEFL achieves high accuracy for clients with small or unbalanced datasets.
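To make the workflow concrete, the following is a minimal sketch of one CEFL round in Python (NumPy only). Every specific choice here is an illustrative assumption rather than the paper's exact design: cosine similarity over flattened weights as the similarity factor, greedy threshold-based clustering, the first member of each cluster acting as its leader, and averaging only the last layer as the partial-layer aggregation.

```python
# Minimal CEFL sketch. All constants and the similarity/aggregation choices
# below are assumptions for illustration, not the paper's exact design.
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 8
LAYER_SHAPES = [(16, 8), (8, 1)]   # toy two-layer model
SIM_THRESHOLD = 0.9                # assumed clustering threshold
SHARED_LAYERS = [1]                # assumed: aggregate only the last layer


def init_model():
    return [rng.standard_normal(s) * 0.1 for s in LAYER_SHAPES]


def flatten(model):
    return np.concatenate([w.ravel() for w in model])


def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def cluster_clients(models):
    """Greedily group clients whose flattened weights are similar enough."""
    clusters = []
    for i, m in enumerate(models):
        for c in clusters:
            if cosine_similarity(flatten(m), flatten(models[c[0]])) >= SIM_THRESHOLD:
                c.append(i)
                break
        else:
            clusters.append([i])   # first member of a new cluster acts as its leader
    return clusters


def local_update(model, noise=0.01):
    """Stand-in for one round of local training on the client's own data."""
    return [w + noise * rng.standard_normal(w.shape) for w in model]


def partial_aggregate(leader_models):
    """Average only the shared layers across the cluster leaders."""
    agg = [w.copy() for w in leader_models[0]]
    for layer in SHARED_LAYERS:
        agg[layer] = np.mean([m[layer] for m in leader_models], axis=0)
    return agg


# --- one CEFL round per iteration -------------------------------------------
client_models = [init_model() for _ in range(NUM_CLIENTS)]
clusters = cluster_clients(client_models)

for rnd in range(3):
    # FL training happens only among the cluster leaders.
    leader_models = [local_update(client_models[c[0]]) for c in clusters]
    global_shared = partial_aggregate(leader_models)

    # Each leader transfers the aggregated shared layers to its members,
    # and every member then fine-tunes on its own local data.
    for c in clusters:
        for member in c:
            for layer in SHARED_LAYERS:
                client_models[member][layer] = global_shared[layer].copy()
            client_models[member] = local_update(client_models[member])

print("clusters:", clusters)
```

In this sketch, only the cluster leaders exchange updates with the server, and only the shared layers are transmitted, which is where the communication savings described above would come from under these assumptions.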