Secure multi-party computation enables multiple mutually distrusting parties to perform computations on data without revealing the data itself, and has become one of the core technologies behind privacy-preserving machine learning. In this work, we present several improved privacy-preserving protocols for both linear and non-linear layers in machine learning. For linear layers, we present an extended Beaver triple protocol for bilinear maps that significantly reduces the communication cost of convolution layers. For non-linear layers, we introduce novel protocols for computing the sigmoid and softmax functions; both are essential building blocks for training classification models. Our protocols are more scalable and robust than prior constructions and improve runtime performance by 3-17x. Finally, we introduce Morse-STF, an end-to-end privacy-preserving system for machine learning training that leverages all of these improved protocols. Our system achieves a 1.8x speedup on logistic regression and a 3.9-4.9x speedup on convolutional neural networks compared to prior state-of-the-art systems.
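To make the linear-layer idea concrete, the sketch below illustrates the standard matrix form of Beaver-triple multiplication that generalizes to bilinear maps such as convolution: instead of one triple per scalar product, the parties consume a single matrix triple (A, B, C = A·B) and only exchange the masked operands. This is a minimal single-process simulation, not the Morse-STF implementation; all function names (`share`, `open_`, `matmul_with_triple`) and the 2-party setting are illustrative assumptions.

```python
# Minimal sketch: 2-party additive secret sharing over Z_{2^64} and matrix
# multiplication with a matrix Beaver triple. Not the Morse-STF API.
import numpy as np

U64_MAX = np.iinfo(np.uint64).max  # uint64 arithmetic wraps, giving the ring Z_{2^64}

def share(x, rng):
    """Split a uint64 matrix into two additive shares."""
    r = rng.integers(0, U64_MAX, size=x.shape, dtype=np.uint64, endpoint=True)
    return r, x - r  # wraps mod 2^64, so r + (x - r) == x

def open_(s0, s1):
    """Reconstruct a shared value (one round of communication in the real protocol)."""
    return s0 + s1

def matmul_with_triple(x_sh, y_sh, triple_sh):
    """Each party holds shares of X, Y and of a triple (A, B, C) with C = A @ B.
    The parties open D = X - A and E = Y - B, then locally combine
        X @ Y = D @ E + D @ B + A @ E + C   (D @ E added by one party only)."""
    (x0, x1), (y0, y1) = x_sh, y_sh
    (a0, a1), (b0, b1), (c0, c1) = triple_sh
    d = open_(x0 - a0, x1 - a1)          # only D and E are sent: O(mk + kn) ring
    e = open_(y0 - b0, y1 - b1)          # elements, not O(mkn) as with scalar triples
    z0 = d @ e + d @ b0 + a0 @ e + c0    # party 0's share (carries the public D @ E)
    z1 =         d @ b1 + a1 @ e + c1    # party 1's share
    return z0, z1

# Usage: check correctness on small random matrices.
rng = np.random.default_rng(0)
x = rng.integers(0, 100, size=(2, 3), dtype=np.uint64)
y = rng.integers(0, 100, size=(3, 4), dtype=np.uint64)
a = rng.integers(0, U64_MAX, size=(2, 3), dtype=np.uint64, endpoint=True)
b = rng.integers(0, U64_MAX, size=(3, 4), dtype=np.uint64, endpoint=True)
triple = (share(a, rng), share(b, rng), share(a @ b, rng))
z0, z1 = matmul_with_triple(share(x, rng), share(y, rng), triple)
assert np.array_equal(open_(z0, z1), x @ y)  # equality holds mod 2^64
```

The same masking pattern applies to any bilinear map (including convolution) by replacing `@` with that map, which is the communication saving the abstract refers to: the online cost scales with the operand sizes rather than with the number of scalar multiplications.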