In the last decade, Federated Learning (FL) has gained relevance for training collaborative models without sharing sensitive data. Since its inception, Centralized FL (CFL) has been the most common approach in the literature, where a single entity creates the global model. However, a centralized approach suffers from a communication bottleneck at the server node, a single point of failure, and the need to trust a central entity. Decentralized Federated Learning (DFL) emerged to address these issues by embracing the principles of data sharing minimization and decentralized model aggregation without relying on centralized architectures. However, despite the work done on DFL, the literature has not (i) studied the main fundamentals differentiating DFL and CFL; (ii) reviewed application scenarios and solutions using DFL; and (iii) analyzed DFL frameworks for creating and evaluating new solutions. To this end, this article identifies and analyzes the main fundamentals of DFL in terms of federation architectures, topologies, communication mechanisms, security approaches, and key performance indicators. Additionally, it explores existing mechanisms for optimizing critical DFL fundamentals. Then, it analyzes and compares the most common DFL application scenarios and solutions according to the fundamentals previously defined. After that, the most relevant features of current DFL frameworks are reviewed and compared. Finally, the evolution of existing DFL solutions is analyzed to derive a list of trends, lessons learned, and open challenges.
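To illustrate the core distinction between CFL and DFL discussed above, the following minimal sketch (not taken from the paper; all names and the ring topology are illustrative assumptions) contrasts server-side aggregation with one round of neighbor-based aggregation in which no central entity ever sees all models.

```python
# Illustrative sketch (not from the paper): centralized aggregation (CFL)
# versus one round of neighbor-based aggregation over a ring topology (DFL).
import numpy as np

rng = np.random.default_rng(0)
num_nodes = 4   # hypothetical number of participants
dim = 3         # hypothetical model size (flattened parameter vector)

# Hypothetical local model parameters, one vector per participant.
local_models = [rng.normal(size=dim) for _ in range(num_nodes)]

# CFL: a central server collects every local model and produces one global model.
global_model = np.mean(local_models, axis=0)

# DFL: no server; each node averages only with its ring neighbors,
# so aggregation is fully decentralized and per-node.
ring_neighbors = {i: [(i - 1) % num_nodes, (i + 1) % num_nodes]
                  for i in range(num_nodes)}
dfl_models = [
    np.mean([local_models[i]] + [local_models[j] for j in ring_neighbors[i]], axis=0)
    for i in range(num_nodes)
]

print("CFL global model:", global_model)
print("DFL per-node models after one gossip round:", dfl_models)
```

Repeating the neighbor-averaging step over successive rounds lets the per-node models converge toward a common solution without any single point of failure, which is the property the abstract attributes to DFL.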