Logistic regression training over encrypted data has long been an attractive way to address data-security concerns. In this paper, we propose a faster gradient variant called the $\texttt{quadratic gradient}$ to implement logistic regression training in the homomorphic encryption domain; its core can be seen as an extension of the simplified fixed Hessian. We enhance Nesterov's accelerated gradient (NAG) and the Adaptive Gradient Algorithm (Adagrad) with this gradient variant and evaluate the enhanced algorithms on several datasets. Experimental results show that the enhanced methods achieve state-of-the-art convergence speed compared with the naive first-order gradient methods. We then adopt the enhanced NAG method to implement homomorphic logistic regression training and obtain comparable results in only $3$ iterations.
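Since the abstract only names the construction, the following is a minimal sketch of the idea, assuming the quadratic gradient is built from the simplified fixed Hessian of Bonte and Vercauteren for logistic regression; the symbols $\epsilon$ (a small positive constant), $\eta_t$ (a step size), and the preconditioner $\bar{B}$ are illustrative notation rather than definitions taken verbatim from the paper. Here $g$ denotes the ordinary gradient of the log-likelihood and $X$ the data matrix:
\begin{align*}
  \tilde{h}_i &= -\frac{1}{4}\sum_{j} \big|(X^{\top}X)_{ij}\big|,
    & \tilde{H} &= \mathrm{diag}(\tilde{h}_1,\dots,\tilde{h}_d),\\
  \bar{B} &= \mathrm{diag}\!\Big(\tfrac{1}{\epsilon + |\tilde{h}_1|},\dots,\tfrac{1}{\epsilon + |\tilde{h}_d|}\Big),
    & G &= \bar{B}\,g,\\
  w_{t+1} &= w_t + (1+\eta_t)\,G. & &
\end{align*}
Because $\bar{B}$ is diagonal and can be fixed across iterations, computing $G$ costs little more than the plain gradient $g$, which is what makes such a variant attractive under homomorphic encryption, where divisions and data-dependent operations are expensive.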