Proof of work (PoW), the representative consensus protocol for blockchain, consumes enormous amounts of computation and energy to determine bookkeeping rights among miners but serves no practical purpose. To address this drawback, we propose a novel energy-recycling consensus mechanism named platform-free proof of federated learning (PF-PoFL), which leverages the computing power originally wasted on solving hard but meaningless PoW puzzles to conduct practical federated learning (FL) tasks. However, the untrusted environment and miners' self-interested behavior raise potential security threats and efficiency concerns. In this paper, by devising a novel block structure, new transaction types, and credit-based incentives, PF-PoFL enables efficient artificial intelligence (AI) task outsourcing, federated mining, model evaluation, and reward distribution in a fully decentralized manner, while resisting spoofing and Sybil attacks. In addition, PF-PoFL is equipped with a user-level differential privacy mechanism for miners to prevent implicit privacy leakage in training FL models. Furthermore, by considering dynamic miner characteristics (e.g., training samples, non-IID degree, and network delay) under diverse FL tasks, a federation formation game-based mechanism is presented to distributively form an optimized disjoint miner partition structure with Nash-stable convergence. Extensive simulations validate the efficiency and effectiveness of PF-PoFL.
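As a rough illustration of the user-level differential privacy idea mentioned above (not the paper's exact mechanism; the function name, parameters, and noise calibration below are illustrative assumptions), a miner could clip its entire local model update and perturb it with Gaussian noise before submission, so that its overall influence on the aggregated FL model is bounded:

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Sketch of user-level DP for a miner's local FL update.

    `update` is a list of NumPy arrays (one per model layer). The whole
    update is clipped to `clip_norm` (bounding the miner's sensitivity),
    then Gaussian noise scaled to that bound is added. Parameter names and
    values are hypothetical, not taken from PF-PoFL.
    """
    rng = rng or np.random.default_rng()
    flat = np.concatenate([w.ravel() for w in update])
    norm = np.linalg.norm(flat)
    scale = min(1.0, clip_norm / (norm + 1e-12))  # clip at the user level
    clipped = [w * scale for w in update]
    sigma = noise_multiplier * clip_norm          # noise calibrated to the clip bound
    return [w + rng.normal(0.0, sigma, size=w.shape) for w in clipped]
```

In such a scheme, each miner would apply this step locally before uploading its update, so the privacy guarantee holds per participant rather than per training sample; the actual privacy budget would depend on the noise multiplier and the number of training rounds.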