Specializing Directed Acyclic Graph Federated Learning (SDAGFL) is a new federated learning framework that updates models from devices with similar data distributions through Directed Acyclic Graph Distributed Ledger Technology (DAG-DLT). SDAGFL offers personalization and resistance to single points of failure and poisoning attacks in fully decentralized federated learning. These advantages make SDAGFL well suited to federated learning in IoT scenarios, where devices are usually battery-powered. To promote the application of SDAGFL in IoT, we propose an energy-optimized SDAGFL based on an event-triggered communication mechanism, called ESDAGFL. In ESDAGFL, a new model is broadcast only when it has changed significantly. We evaluate ESDAGFL on a clustered synthetic FEMNIST dataset and a dataset built from the texts of Shakespeare's and Goethe's works. The experimental results show that our approach reduces energy consumption by 33\% compared with SDAGFL, while achieving the same balance between training accuracy and specialization as SDAGFL.
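The event-triggered broadcast condition can be illustrated with a minimal sketch. Assumptions not taken from the text: the change is measured as the L2 norm of the parameter difference since the last broadcast, and `threshold` and `broadcast_fn` are hypothetical placeholders; the paper's actual trigger metric may differ.

```python
import numpy as np

def model_change(new_params: np.ndarray, last_broadcast: np.ndarray) -> float:
    """Assumed change metric: L2 norm of the parameter delta since the last broadcast."""
    return float(np.linalg.norm(new_params - last_broadcast))

def maybe_broadcast(new_params, last_broadcast, threshold, broadcast_fn):
    """Broadcast the locally updated model only if it changed significantly.

    Returns the parameters now considered the 'last broadcast' reference.
    """
    if model_change(new_params, last_broadcast) > threshold:
        broadcast_fn(new_params)   # hypothetical publish of a new transaction to the DAG
        return new_params          # update the reference for the next trigger check
    return last_broadcast          # skip the broadcast, saving radio energy
```

The point of the sketch is only that communication (the dominant energy cost on battery-powered IoT devices) is suppressed whenever local training produces no significant model change.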