The proliferation of resource-rich mobile devices that store rich, multidimensional, and privacy-sensitive user data motivates the design of federated learning (FL), a machine-learning (ML) paradigm that enables mobile devices to produce an ML model without sharing their data. However, most existing FL frameworks rely on centralized entities. In this work, we introduce IPLS, a fully decentralized federated learning framework partially based on the InterPlanetary File System (IPFS). Using IPLS and connecting to the corresponding private IPFS network, any party can initiate the training of an ML model or join an ongoing training process started by another party. IPLS scales with the number of participants, is robust against intermittent connectivity and dynamic participant arrivals and departures, requires minimal resources, and guarantees that the accuracy of the trained model quickly converges to that of a centralized FL framework, with an accuracy drop of less than one part per thousand.
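As an illustration of the training step the abstract refers to, the sketch below shows the weighted federated-averaging step that any FL framework, centralized or decentralized, performs on device updates; in IPLS this aggregation would happen peer-to-peer over the private IPFS network rather than at a central server. The function name and arguments are hypothetical, not part of the IPLS API.

```python
# Hedged sketch: weighted federated averaging (FedAvg-style), not the
# actual IPLS implementation. Each device contributes a parameter vector
# weighted by the size of its local dataset.

def federated_average(local_models, sample_counts):
    """Return the weighted average of per-device parameter vectors.

    local_models:  list of parameter vectors (one list of floats per device)
    sample_counts: number of local training samples held by each device
    """
    total = sum(sample_counts)
    dim = len(local_models[0])
    global_model = [0.0] * dim
    for params, n in zip(local_models, sample_counts):
        for i, p in enumerate(params):
            global_model[i] += (n / total) * p
    return global_model

# Example: two devices, the second holding three times as much data,
# so its parameters receive three times the weight.
print(federated_average([[0.0, 4.0], [4.0, 0.0]], [1, 3]))  # → [3.0, 1.0]
```

In a decentralized setting such as IPLS, each participant would publish its update to the shared storage layer and run this aggregation locally over the updates it retrieves, instead of relying on a central aggregator.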