Federated learning (FL) is an emerging branch of AI that enables edge devices to collaboratively train a global machine learning model without centralizing data, offering privacy by default. Despite this remarkable advancement, the paradigm comes with various challenges. Specifically, in large-scale deployments, client heterogeneity is the norm, and it impacts training quality metrics such as accuracy, fairness, and training time. Moreover, energy consumption across these battery-constrained devices remains largely unexplored and limits the wide adoption of FL. To address this issue, we develop EAFL, an energy-aware FL client-selection method that accounts for energy consumption to maximize the participation of heterogeneous target devices. EAFL is a power-aware training algorithm that selects clients with higher battery levels while also maximizing system efficiency. Our design jointly minimizes the time-to-accuracy and maximizes the remaining on-device battery levels. EAFL improves the testing model accuracy by up to 85\% and decreases the drop-out of clients by up to 2.45$\times$.
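As a rough sketch of the selection idea only (the exact objective is not stated here, so the notation below is an assumption rather than EAFL's actual formulation), each round the server could score every eligible client $i$ by a utility that trades off its remaining battery level $B_i \in [0,1]$ against its estimated round-completion time $T_i$,
\[
U_i \;=\; \alpha\, B_i \;-\; (1-\alpha)\, T_i, \qquad \alpha \in [0,1],
\]
and select the top-$k$ clients by $U_i$. Under this reading, a larger $\alpha$ favors battery-rich clients (fewer drop-outs), while a smaller $\alpha$ favors fast clients (lower time-to-accuracy), matching the joint goal described above.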