Deep Neural Networks (DNNs) have achieved remarkable progress in various real-world applications, especially when abundant training data are provided. However, data isolation has become a serious practical problem. Existing works build privacy-preserving DNN models from either an algorithmic perspective or a cryptographic perspective. The former mainly splits the DNN computation graph between data holders or between data holders and a server, which demonstrates good scalability but suffers from accuracy loss and potential privacy risks. In contrast, the latter leverages time-consuming cryptographic techniques, which offer strong privacy guarantees but scale poorly. In this paper, we propose SPNN - a Scalable and Privacy-preserving deep Neural Network learning framework, designed from an algorithmic-cryptographic co-perspective. From the algorithmic perspective, we split the computation graph of DNN models into two parts, i.e., the private-data-related computations performed by the data holders and the remaining heavy computations delegated to a server with high computation ability. From the cryptographic perspective, we propose using two types of cryptographic techniques, i.e., secret sharing and homomorphic encryption, so that the isolated data holders can conduct the private-data-related computations privately and cooperatively. Furthermore, we implement SPNN in a decentralized setting and introduce user-friendly APIs. Experimental results on real-world datasets demonstrate the superiority of SPNN.
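To make the cryptographic building block mentioned above concrete, the following is a minimal sketch of two-party additive secret sharing over a prime field, one of the two techniques the abstract names. The modulus, function names, and two-party layout are assumptions chosen for exposition only; they are not the SPNN implementation.

```python
import secrets

# Field modulus (assumption for illustration): a Mersenne prime.
P = 2**61 - 1

def share(x: int) -> tuple[int, int]:
    """Split a private value x into two additive shares with x = s0 + s1 (mod P)."""
    s0 = secrets.randbelow(P)
    s1 = (x - s0) % P
    return s0, s1

def reconstruct(s0: int, s1: int) -> int:
    """Recover the value from both shares; either share alone reveals nothing about x."""
    return (s0 + s1) % P

def add_shared(a: tuple[int, int], b: tuple[int, int]) -> tuple[int, int]:
    """Addition on shared values: each holder adds its own shares locally,
    so no plaintext ever leaves a data holder."""
    return ((a[0] + b[0]) % P, (a[1] + b[1]) % P)

if __name__ == "__main__":
    x, y = 12345, 67890
    sx, sy = share(x), share(y)
    assert reconstruct(*add_shared(sx, sy)) == (x + y) % P
```

In a split-graph setting of the kind the abstract describes, such shares would stand in for the private inputs during the computations that the data holders perform jointly, while only non-sensitive intermediate results are handed to the server for the heavy computation.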