Deep neural networks are now widely applied across many domains. However, the massive data collection they require raises potential privacy issues and consumes large amounts of communication bandwidth. To address these problems, we propose a privacy-preserving method for distributed federated learning systems that operates on Intel Software Guard Extensions (SGX), a set of instructions that increases the security of application code and data. However, encrypting the models enlarges the transmission overhead. We therefore reduce the communication cost through sparsification, which achieves reasonable accuracy across different model architectures.
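To make the sparsification idea concrete, here is a minimal sketch of top-k magnitude sparsification, a common approach for compressing model updates before transmission. The function names (`sparsify_top_k`, `densify`) and the keep ratio are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sparsify_top_k(update: np.ndarray, ratio: float = 0.01):
    """Keep only the largest-magnitude `ratio` fraction of entries.

    Returns (indices, values): a compact representation a client could
    encrypt and upload instead of the dense model update.
    (Hypothetical sketch; not the paper's actual method.)
    """
    flat = update.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def densify(indices, values, shape):
    """Reconstruct a dense update (e.g., on the aggregation server)."""
    flat = np.zeros(int(np.prod(shape)))
    flat[indices] = values
    return flat.reshape(shape)

# Example: keep the top 50% of a 2x2 update.
grad = np.array([[0.5, -0.01], [0.02, -2.0]])
idx, vals = sparsify_top_k(grad, ratio=0.5)
dense = densify(idx, vals, grad.shape)
```

With a 1% keep ratio, only index-value pairs for the selected entries need to be encrypted and sent, shrinking the payload roughly a hundredfold at some cost to update fidelity.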