The rapid development of information technology has made the privacy of personal information increasingly prominent in the big data era. The major challenge is to guarantee that sensitive personal information is not disclosed while data are published and analyzed. Centralized differential privacy rests on the assumption of a trusted third-party data curator, an assumption that does not always hold in practice. Local differential privacy, a newer privacy-preservation model, offers comparatively strong privacy guarantees. Although federated learning is often regarded as a privacy-preserving approach to distributed learning, it still introduces various privacy concerns. To avoid these privacy threats and reduce communication costs, in this article we propose integrating federated learning and local differential privacy with momentum gradient descent to improve the performance of machine learning models.
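A minimal sketch of what such a client-side update could look like, assuming L1 gradient clipping and the Laplace mechanism; the function name, parameter values, and noise calibration below are illustrative assumptions for exposition, not the paper's exact construction:

```python
import numpy as np

def ldp_momentum_step(w, v, grad, lr=0.01, beta=0.9, clip=1.0, epsilon=1.0):
    """One hypothetical client-side update: clip the gradient, perturb it
    for local differential privacy, then apply momentum gradient descent.
    """
    # Clip the gradient in L1 norm so its sensitivity is bounded by `clip`.
    l1_norm = np.sum(np.abs(grad))
    grad = grad * min(1.0, clip / max(l1_norm, 1e-12))

    # Any two clipped gradients differ by at most 2*clip in L1 norm, so
    # i.i.d. Laplace noise with scale 2*clip/epsilon per coordinate yields
    # an epsilon-locally-differentially-private gradient report.
    noisy_grad = grad + np.random.laplace(0.0, 2.0 * clip / epsilon,
                                          size=grad.shape)

    # Momentum gradient descent on the perturbed gradient.
    v = beta * v + noisy_grad
    w = w - lr * v
    return w, v
```

In a federated setting, each client would run such a step locally and send only the perturbed update to the server, which aggregates the reports into the global model; the momentum term helps damp the variance that the injected noise adds to individual updates.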