Neural networks, which provide efficient predictive models, have been widely adopted in medicine, finance, and other fields, bringing great convenience to our lives. However, training highly accurate models requires large amounts of data from multiple parties, raising public concerns about privacy. Privacy-preserving neural networks based on multi-party computation are one current approach to providing model training and inference while addressing data privacy. In this study, we propose a new two-party (2PC) privacy-preserving neural network training and inference framework in which private data is distributed to two non-colluding servers. We construct a preprocessing protocol for mask generation, support and realize secret-sharing comparison in the 2PC setting, and propose a new method to further reduce the number of communication rounds. On top of the comparison protocol, we construct building blocks such as division and exponentiation, and realize training and inference entirely over arithmetic secret sharing, with no conversions between different types of secret sharing. Compared with previous works, our framework achieves higher accuracy, very close to that of plaintext training. While improving accuracy, it also reduces runtime: in the online phase, our work is 5x faster than SecureML, 4.32-5.75x faster than SecureNN, and very close to the current best 3PC implementation, FALCON. For secure inference, to the best of our knowledge, ours is the fastest 2PC implementation, 4-358x faster than other works.
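To make the "arithmetic secret sharing" setting concrete, the following is a minimal illustrative sketch of 2-out-of-2 additive secret sharing over the ring Z_{2^64}, the kind of sharing the framework operates on. The ring size, function names, and the local-addition example are illustrative assumptions, not the paper's actual protocol or parameters; the paper's comparison, division, and exponentiation protocols build on top of such shares.

```python
import secrets

# Ring Z_{2^64}; a common choice in MPC frameworks, assumed here for illustration.
MOD = 2**64

def share(x):
    """Split x into two additive shares, one per non-colluding server."""
    r = secrets.randbelow(MOD)          # uniformly random mask
    return r, (x - r) % MOD             # shares sum to x mod 2^64

def reconstruct(s0, s1):
    """Recombine the two servers' shares to recover the secret."""
    return (s0 + s1) % MOD

def add_shares(a, b):
    """Linear operations need no communication: each server adds locally."""
    return (a[0] + b[0]) % MOD, (a[1] + b[1]) % MOD

a = share(7)
b = share(35)
c = add_shares(a, b)
assert reconstruct(*c) == 42
```

Neither share alone reveals anything about the secret, since each is uniformly random on its own; non-linear steps (comparison, multiplication) are what require the preprocessing and communication rounds the abstract refers to.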