Federated learning (FL) is one of the most appealing alternatives to the standard centralized learning paradigm, as it allows a heterogeneous set of devices to train a machine learning model without sharing their raw data. However, FL requires a central server to coordinate the learning process, which introduces potential scalability and security issues. In the literature, server-less FL approaches such as gossip federated learning (GFL) and blockchain-enabled federated learning (BFL) have been proposed to mitigate these issues. In this work, we provide a complete overview of centralized FL (CFL), GFL, and BFL, and compare them according to a comprehensive set of performance indicators, including model accuracy, time complexity, communication overhead, convergence time, and energy consumption. An extensive simulation campaign allows us to carry out a quantitative analysis. In particular, GFL saves 18% of the training time, 68% of the energy, and 51% of the data to be shared with respect to the CFL solution, but it cannot reach the accuracy level of CFL. BFL, on the other hand, represents a viable solution for implementing decentralized learning with a higher level of security, at the cost of extra energy usage and data sharing. Finally, we identify open issues in the two decentralized FL implementations and provide insights into potential extensions and research directions in this emerging field.
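To make the architectural difference between the compared schemes concrete, the following minimal sketch (an illustration under our own assumptions, not the implementation evaluated in this work) contrasts one aggregation round of server-based CFL (FedAvg-style averaging at a central server) with one gossip round of GFL, in which each device averages its model with a single randomly chosen peer. Model weights are plain NumPy vectors and the `local_update` stub stands in for actual local training; both names are hypothetical.

```python
# Minimal sketch: CFL (server-side FedAvg averaging) vs. GFL (peer-to-peer
# gossip averaging). Illustrative only; local training is stubbed out as a
# small random perturbation so the example stays self-contained.
import numpy as np

rng = np.random.default_rng(0)
NUM_DEVICES, DIM = 5, 4
weights = [rng.normal(size=DIM) for _ in range(NUM_DEVICES)]

def local_update(w):
    """Hypothetical placeholder for one round of local training on private data."""
    return w - 0.1 * rng.normal(size=w.shape)

# --- CFL: every device uploads to a central server, which averages (FedAvg).
local = [local_update(w) for w in weights]
global_model = np.mean(local, axis=0)               # server-side aggregation
cfl_weights = [global_model.copy() for _ in local]  # server broadcasts back

# --- GFL: no server; each device averages with one randomly chosen peer.
gfl_weights = [local_update(w) for w in weights]
for i in range(NUM_DEVICES):
    j = rng.choice([k for k in range(NUM_DEVICES) if k != i])
    avg = (gfl_weights[i] + gfl_weights[j]) / 2
    gfl_weights[i], gfl_weights[j] = avg, avg.copy()

# CFL reaches consensus in a single round; gossip only mixes models locally,
# which is one intuition for why GFL trades accuracy/convergence speed for
# removing the central server.
print("CFL model spread:", np.ptp([np.linalg.norm(w) for w in cfl_weights]))
print("GFL model spread:", np.ptp([np.linalg.norm(w) for w in gfl_weights]))
```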