Logistic regression training over encrypted data has been an attractive approach to addressing security and privacy concerns for years. In this paper, we propose a faster gradient variant, termed $\texttt{quadratic gradient}$, to implement logistic regression training in the homomorphic encryption domain; its core can be seen as an extension of the simplified fixed Hessian. We enhance Nesterov's accelerated gradient (NAG) and the Adaptive Gradient Algorithm (Adagrad) with this gradient variant and evaluate the enhanced algorithms on several datasets. Experimental results show that the enhanced methods achieve state-of-the-art convergence speed compared with the naive first-order gradient methods. We then adopt the enhanced NAG method to implement homomorphic logistic regression training and obtain comparable results in only $3$ iterations.
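As a brief sketch of the idea (with illustrative notation; the precise construction is given in the body of the paper), take the simplified-fixed-Hessian bound $B = \frac{1}{4}X^{\top}X$ for a data matrix $X \in \mathbb{R}^{n \times d}$ and form
$$
\bar{B} = \operatorname{diag}\!\big(\tilde{b}_1^{-1}, \ldots, \tilde{b}_d^{-1}\big), \qquad
\tilde{b}_i = \epsilon + \sum_{j=1}^{d} \lvert B_{ij} \rvert, \qquad
G = \bar{B}\, g,
$$
where $g$ is the ordinary logistic-regression gradient and $\epsilon > 0$ guards against division by zero; the enhanced NAG and Adagrad methods then use the quadratic gradient $G$ in place of $g$, with a correspondingly adjusted learning rate.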