The concept of Federated Learning has emerged from the convergence of distributed machine learning and information and communication technology. It is vital to the development of distributed machine learning, which is expected to be fully decentralized, robust, communication-efficient, and secure. However, federated learning settings that rely on a central server cannot meet the requirements of fully decentralized networks. In this paper, we propose a fully decentralized, efficient, and privacy-preserving global model training protocol, named PPT, for federated learning in Peer-to-Peer (P2P) networks. PPT uses one-hop communication to aggregate local model update parameters and adopts a symmetric cryptosystem to ensure security. Notably, PPT modifies the Eschenauer-Gligor (E-G) key pre-distribution scheme to distribute the encryption keys. PPT also adopts Neighborhood Broadcast, Supervision and Report, and Termination as complementary mechanisms to enhance security and robustness. Through extensive analysis, we demonstrate that PPT resists various security threats and preserves user privacy. Carefully designed experiments demonstrate its utility and efficiency as well.
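To make the key-distribution step concrete, below is a minimal Python sketch of the basic Eschenauer-Gligor pre-distribution idea that PPT builds on: each node is pre-loaded with a random key ring drawn from a large key pool, and two neighbors that share at least one pool key can derive a common symmetric link key. The pool size, ring size, and helper names here are illustrative assumptions; PPT's specific modifications to E-G are not reproduced.

```python
import random
import secrets

POOL_SIZE = 1000   # size of the global key pool (illustrative assumption)
RING_SIZE = 50     # keys pre-loaded onto each node (illustrative assumption)

# Key pool: key id -> 128-bit symmetric key.
key_pool = {kid: secrets.token_bytes(16) for kid in range(POOL_SIZE)}

def assign_key_ring(pool, ring_size):
    """Draw a random key ring for one node (E-G pre-distribution step)."""
    ids = random.sample(sorted(pool), ring_size)
    return {kid: pool[kid] for kid in ids}

alice = assign_key_ring(key_pool, RING_SIZE)
bob = assign_key_ring(key_pool, RING_SIZE)

# Shared-key discovery: neighbors exchange key ids and keep the intersection.
shared_ids = set(alice) & set(bob)
if shared_ids:
    kid = min(shared_ids)      # deterministic choice both sides can make
    link_key = alice[kid]      # symmetric key now shared between Alice and Bob
    print(f"link key established via pool key {kid}")
else:
    print("no shared pool key; E-G falls back to path-key establishment")
```

In the original E-G scheme, a pair of neighbors with no common pool key establishes a path key through intermediaries that do share keys with both; the shared link key can then protect the encrypted one-hop exchange of model update parameters.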