With the rapid development of artificial intelligence, a wide range of engineering applications have been deployed one after another. The gradient descent method plays an important role in solving various optimization problems because of its simple structure, good stability, and easy implementation. In multi-node machine learning systems, gradients usually need to be shared, and data reconstruction attacks can recover the training data from the gradient information alone. In this paper, to prevent gradient leakage while preserving model accuracy, we propose the super stochastic gradient descent approach, which updates parameters by concealing the modulus length of each gradient vector and converting it into a unit vector. Furthermore, we analyze the security of the stochastic gradient descent approach. Experimental results show that our approach is clearly superior to prevalent gradient descent approaches in terms of accuracy and robustness.
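To make the update rule concrete, the following is a minimal sketch of a normalized-gradient step in NumPy. The function names `unit_gradient` and `ssgd_step`, the `eps` safeguard, and the choice to normalize the entire gradient at once are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def unit_gradient(grad, eps=1e-12):
    """Conceal the modulus length by rescaling the gradient to a unit vector."""
    norm = np.linalg.norm(grad)
    return grad / (norm + eps)  # eps guards against a zero gradient

def ssgd_step(params, grad, lr=0.1):
    """One parameter update that uses only the gradient's direction."""
    return params - lr * unit_gradient(grad)

# Usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([3.0, -4.0])
for _ in range(100):
    w = ssgd_step(w, 2 * w, lr=0.1)
print(w)  # approaches the origin, oscillating within one step size
```

Because only the unit direction leaves the node, an eavesdropper who observes the shared quantity learns the gradient's orientation but not its magnitude, which is the property the abstract claims hinders gradient-based data reconstruction.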