Federated learning (FL) is a distributed, privacy-preserving learning framework for predictive modeling over the massive data generated at the edge by Internet of Things (IoT) devices. One major challenge preventing the wide adoption of FL in IoT is the pervasive power-supply constraint of IoT devices: local training and model updates impose intensive energy consumption on battery-powered clients. Low battery levels eventually cause clients to drop out of the edge network early, which loses training data, jeopardizes FL performance, and reduces the clients' availability for other designated tasks. In this paper, we propose FedLE, an energy-efficient client selection framework that extends the lifespan of edge IoT networks. In FedLE, each client first trains for a minimum number of epochs to generate a local model update. These models are partially uploaded to the server, which computes the similarity between each pair of clients. Clustering is then performed over these pairwise similarities to group clients with similar model distributions. In each round, low-powered clients are assigned a lower selection probability, delaying the depletion of their batteries. Empirical studies show that FedLE outperforms baselines on benchmark datasets and sustains more training rounds than FedAvg under battery power constraints.
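The battery-aware selection step described above can be illustrated with a short sketch. The weighting below (selection probability proportional to remaining battery) is an assumption for illustration only; the abstract does not specify FedLE's exact probability formula, and the function name `select_clients` is hypothetical.

```python
import random


def select_clients(battery_levels, k, seed=None):
    """Sample k distinct clients, favoring those with higher battery.

    battery_levels: dict mapping client id -> remaining battery in (0, 1].
    Illustrative weighting only, not FedLE's actual selection rule:
    probability of selection is proportional to remaining battery, so
    low-powered clients are picked less often and drain more slowly.
    """
    rng = random.Random(seed)
    ids = list(battery_levels)
    weights = [battery_levels[c] for c in ids]
    chosen = []
    for _ in range(min(k, len(ids))):
        # Draw one client weighted by battery, then remove it so the
        # final selection contains k distinct clients.
        pick = rng.choices(range(len(ids)), weights=weights, k=1)[0]
        chosen.append(ids.pop(pick))
        weights.pop(pick)
    return chosen
```

For example, with batteries `{"a": 0.9, "b": 0.5, "c": 0.1}` and `k=2`, client `c` is the least likely to appear in the selected pair, so its battery is drained less frequently across rounds.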