Since concerns about privacy leakage strongly discourage users from sharing their data, federated learning has gradually become a promising technique in both academia and industry for achieving collaborative learning without revealing information about local data. Unfortunately, most federated learning solutions cannot simultaneously verify the execution of each participant's local machine learning model and protect the privacy of user data in an efficient manner. In this article, we first propose a Zero-Knowledge Proof-based Federated Learning (ZKP-FL) scheme on blockchain. It leverages zero-knowledge proofs for both the computation over local data and the aggregation of local model parameters, aiming to verify the computation process without requiring the plaintext of the local data. We further propose a Practical ZKP-FL (PZKP-FL) scheme to support fraction and non-linear operations. Specifically, we explore a Fraction-Integer mapping function and use Taylor expansion to efficiently handle non-linear operations while maintaining the accuracy of the federated learning model. We also analyze the security of PZKP-FL. Performance analysis demonstrates that the total running time of the PZKP-FL scheme is less than about one minute under parallel execution.
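To illustrate the two ideas named above, the following is a minimal sketch (not the authors' implementation) of how a Fraction-Integer mapping and a Taylor-series approximation might look: real-valued model parameters are mapped to field integers via fixed-point scaling so they can be handled inside a zero-knowledge proof circuit, and a non-linear activation (sigmoid here) is replaced by a low-degree polynomial so that only additions and multiplications remain. The prime modulus `P`, the scaling factor `S`, and the function names are illustrative assumptions, not values from the paper.

```python
# Sketch of a fraction-to-integer mapping and a Taylor approximation
# of sigmoid; parameters P and S are illustrative assumptions.

P = 2**61 - 1          # illustrative prime modulus for field arithmetic
S = 10**6              # fixed-point scaling factor (assumption)

def to_field(x: float) -> int:
    """Map a real-valued model parameter to an integer in the field."""
    return round(x * S) % P

def from_field(a: int) -> float:
    """Map a field element back to a real value (elements > P/2 are negative)."""
    if a > P // 2:
        a -= P
    return a / S

def sigmoid_taylor(x: float, terms: int = 3) -> float:
    """Taylor expansion of sigmoid around 0:
    sigmoid(x) ~= 1/2 + x/4 - x^3/48 + x^5/480 - ..."""
    coeffs = [0.5, 0.25, 0.0, -1.0 / 48, 0.0, 1.0 / 480]
    return sum(c * x**i for i, c in enumerate(coeffs[: 2 * terms]))

if __name__ == "__main__":
    w = 0.37                      # a local model parameter
    enc = to_field(w)             # integer form usable inside a ZKP circuit
    print(enc, from_field(enc))   # round-trips up to 1/S precision
    print(sigmoid_taylor(0.5))    # ~0.62246, close to the exact 0.62246
```

The round-trip through `to_field`/`from_field` loses at most 1/S of precision, which is the trade-off controlling how closely the integer-only computation tracks the original floating-point model.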