In this paper, we address the problem of privacy-preserving training and evaluation of neural networks in an $N$-party, federated learning setting. We propose a novel system, POSEIDON, the first of its kind in the regime of privacy-preserving neural network training. It employs multiparty lattice-based cryptography to preserve the confidentiality of the training data, the model, and the evaluation data, under a passive-adversary model and collusions between up to $N-1$ parties. To efficiently execute the secure backpropagation algorithm for training neural networks, we provide a generic packing approach that enables Single Instruction, Multiple Data (SIMD) operations on encrypted data. We also introduce arbitrary linear transformations within the cryptographic bootstrapping operation, optimizing the costly cryptographic computations over the parties, and we define a constrained optimization problem for choosing the cryptographic parameters. Our experimental results show that POSEIDON achieves accuracy similar to centralized or decentralized non-private approaches and that its computation and communication overhead scales linearly with the number of parties. POSEIDON trains a 3-layer neural network on the MNIST dataset with 784 features and 60K samples distributed among 10 parties in less than 2 hours.