Logistic regression training over an encrypted dataset has been an attractive idea for addressing security concerns for years. In this paper, we propose a faster gradient variant called quadratic gradient for logistic regression and implement it via a special homomorphic encryption scheme. The core of this gradient variant can be seen as an extension of the simplified fixed Hessian from Newton's method: it extracts information from the Hessian matrix into the naive gradient, and can therefore be used to enhance Nesterov's accelerated gradient (NAG), Adagrad, and similar methods. We evaluate various gradient \textit{ascent} methods equipped with this gradient variant on the gene dataset provided by the 2017 iDASH competition and on the image dataset from the MNIST database. Experimental results show that the enhanced methods converge faster and sometimes even to a better result. We also implement the gradient variant in full-batch NAG and mini-batch NAG for training a logistic regression model on a large dataset in the encrypted domain. Equipped with this gradient variant, both full-batch NAG and mini-batch NAG are faster than their original counterparts.
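As a concrete illustration, the sketch below trains a plaintext logistic regression model with NAG enhanced by the quadratic gradient. It assumes the simplified-fixed-Hessian construction: a diagonal scaling $\bar{B}$ built once from row sums of $|\frac{1}{4}X^{\top}X|$, with $\bar{B}_{ii} = 1/(\epsilon + \sum_j |(\frac{1}{4}X^{\top}X)_{ij}|)$, and the quadratic gradient $G = \bar{B}\,g$ replacing the naive gradient $g$. The function name, learning-rate choice, and epoch count are illustrative assumptions, and no homomorphic encryption is involved here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_gradient_nag(X, y, epochs=50, lr=1.0, eps=1e-8):
    """NAG for logistic regression (gradient ascent on the log-likelihood),
    enhanced with the quadratic gradient G = B_bar * g.

    Hypothetical plaintext sketch; X is (n, d), y holds 0/1 labels.
    """
    n, d = X.shape
    # Simplified-fixed-Hessian bound: row sums of |(1/4) X^T X| give a
    # diagonally dominant bound; invert its diagonal to get B_bar.
    H = 0.25 * X.T @ X
    B_bar = 1.0 / (eps + np.abs(H).sum(axis=1))

    W = np.zeros(d)          # model weights
    V = np.zeros(d)          # Nesterov lookahead point
    lam = 0.0                # momentum schedule state
    for _ in range(epochs):
        g = X.T @ (y - sigmoid(X @ V))   # naive log-likelihood gradient at V
        G = B_bar * g                    # quadratic gradient
        W_next = V + lr * G              # ascent step; lr near 1 since B_bar pre-scales
        lam_next = (1 + np.sqrt(1 + 4 * lam * lam)) / 2
        gamma = (1 - lam) / lam_next     # Nesterov extrapolation coefficient
        V = (1 - gamma) * W_next + gamma * W
        W, lam = W_next, lam_next
    return W

# Illustrative usage on synthetic data:
# rng = np.random.default_rng(0)
# X = rng.normal(size=(200, 5))
# y = (X @ rng.normal(size=5) > 0).astype(float)
# w = quadratic_gradient_nag(X, y)
```

Because $\bar{B}$ already rescales each coordinate by a Hessian-derived bound, a learning rate close to 1 suffices, which is what lets the enhanced methods take larger effective steps than the naive gradient.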