We propose and analyze a stochastic Newton algorithm for homogeneous distributed stochastic convex optimization, where each machine can compute stochastic gradients of the same population objective, as well as stochastic Hessian-vector products (products of an independent unbiased estimator of the Hessian of the population objective with arbitrary vectors), with many such stochastic computations performed between rounds of communication. We show that our method can reduce the number and frequency of required communication rounds compared to existing methods without hurting performance, proving convergence guarantees for quasi-self-concordant objectives (e.g., logistic regression) and supporting them with empirical evidence.
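To make the two oracles concrete, here is a minimal single-machine Python sketch for logistic regression: it forms an unbiased minibatch gradient, estimates Hessian-vector products from independent minibatches, and plugs them into a damped conjugate-gradient Newton step. The batch sizes, damping, and inner CG solver are illustrative assumptions, not the exact procedure analyzed in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_grad(w, X, y, rng, batch=32):
    # Unbiased minibatch gradient of the logistic loss, labels y in {-1, +1}.
    idx = rng.choice(len(y), batch)
    Xb, yb = X[idx], y[idx]
    s = sigmoid(-yb * (Xb @ w))            # per-example residual weights
    return -(Xb * (yb * s)[:, None]).mean(axis=0)

def stochastic_hvp(w, v, X, y, rng, batch=32):
    # Unbiased estimate of H(w) @ v; a fresh minibatch is drawn for each
    # product, mirroring the independent Hessian estimator in the abstract.
    idx = rng.choice(len(y), batch)
    Xb = X[idx]
    p = sigmoid(Xb @ w)
    d = p * (1 - p)                         # per-example curvature
    return (Xb * (d * (Xb @ v))[:, None]).mean(axis=0)

def local_newton_step(w, X, y, rng, cg_iters=10, damping=1e-3):
    # Approximately solve (H + damping * I) d = g by conjugate gradient,
    # with every Hessian-vector product replaced by a stochastic estimate.
    g = stochastic_grad(w, X, y, rng)
    d = np.zeros_like(w)
    r, p = g.copy(), g.copy()
    rs = r @ r
    for _ in range(cg_iters):
        Hp = stochastic_hvp(w, p, X, y, rng) + damping * p
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new < 1e-12:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return w - d                            # damped Newton update
```

In the distributed setting described above, each machine would run many such local steps between communication rounds, with iterates averaged across machines at each round; this sketch shows only the per-machine computation.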