Privacy and communication efficiency are important challenges in federated training of neural networks, and achieving both simultaneously is still an open problem. In this work, we develop a method that unifies highly compressed communication and differential privacy (DP). We introduce a compression technique based on Relative Entropy Coding (REC) to the federated setting. With a minor modification to REC, we obtain a provably differentially private learning algorithm, DP-REC, and show how to compute its privacy guarantees. Our experiments demonstrate that DP-REC drastically reduces communication costs while providing privacy guarantees comparable to the state-of-the-art.
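To make the underlying mechanism concrete, the following is a minimal sketch of the one-shot relative-entropy-coding idea (often called minimal random coding) for a single Gaussian coordinate. The Gaussian prior/posterior choice, the function names, and the candidate count are illustrative assumptions for exposition, not the paper's exact algorithm; DP-REC itself builds on this kind of sampling, whose inherent randomness is what a minor modification turns into a differential privacy guarantee.

```python
import numpy as np
from scipy.stats import norm

def rec_encode(mu_q, sigma_q, mu_p, sigma_p, num_samples, seed):
    """One-shot REC sketch (minimal random coding).

    Sender and receiver share `seed`, so both can regenerate the same
    candidate pool drawn from the prior p = N(mu_p, sigma_p). The sender
    picks one candidate with probability proportional to the importance
    weight q(z)/p(z) and transmits only its index, which costs roughly
    log2(num_samples) bits instead of a full floating-point value.
    """
    rng = np.random.default_rng(seed)
    z = rng.normal(mu_p, sigma_p, size=num_samples)      # shared candidates
    log_w = norm.logpdf(z, mu_q, sigma_q) - norm.logpdf(z, mu_p, sigma_p)
    w = np.exp(log_w - log_w.max())                      # stabilized weights
    idx = rng.choice(num_samples, p=w / w.sum())         # sample an index
    return idx

def rec_decode(idx, mu_p, sigma_p, num_samples, seed):
    """Receiver regenerates the same candidate pool and reads off the sample."""
    rng = np.random.default_rng(seed)
    z = rng.normal(mu_p, sigma_p, size=num_samples)
    return z[idx]

# Communicating an approximate sample from q = N(0.5, 0.1) under p = N(0, 1):
# with 2**12 candidates the message is a 12-bit index rather than a 32-bit float.
idx = rec_encode(0.5, 0.1, 0.0, 1.0, num_samples=2**12, seed=42)
z_hat = rec_decode(idx, 0.0, 1.0, num_samples=2**12, seed=42)
```

Because the decoder receives a genuine (approximate) sample from the posterior rather than a deterministic quantization, the communicated value is itself noisy; this is the property that allows the privacy analysis to piggyback on the compression step instead of adding noise separately.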