We propose Local Momentum Tracking (LMT), a novel distributed stochastic gradient method for solving optimization problems over networks. To reduce communication overhead, LMT allows each agent to perform multiple local updates between consecutive communication rounds. Specifically, LMT integrates local updates with the momentum tracking strategy and the Loopless Chebyshev Acceleration (LCA) technique. We demonstrate that LMT achieves linear speedup with respect to both the number of local updates and the number of agents when minimizing smooth objective functions. Moreover, with sufficiently many local updates ($Q\geq Q^*$), LMT attains the optimal communication complexity, while for a moderate number of local updates ($Q\in[1,Q^*]$) it achieves the optimal iteration complexity. To our knowledge, LMT is the first method that enjoys both of these properties.
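To make the algorithmic pattern concrete, below is a minimal Python/NumPy sketch of the generic "multiple local updates between communication rounds, with a momentum/gradient-tracking variable" template the abstract describes. It is an illustration under our own assumptions, not the authors' exact LMT recursion: the LCA step is omitted, and all names (grad, W, eta, beta, Q) are hypothetical placeholders.

    # Conceptual sketch only -- NOT the authors' exact LMT/LCA updates.
    # n agents each run Q local momentum steps on a tracked gradient
    # estimate, then average iterates and trackers over a gossip matrix W.
    import numpy as np

    def lmt_sketch(grad, x0, W, rounds=50, Q=5, eta=0.05, beta=0.9):
        """grad(i, x): stochastic gradient of agent i's loss at x (assumed)."""
        n, d = W.shape[0], x0.shape[0]
        x = np.tile(x0, (n, 1))                 # local iterates, shape (n, d)
        m = np.zeros((n, d))                    # momentum buffers
        y = np.array([grad(i, x[i]) for i in range(n)])  # tracking variables
        g_prev = y.copy()
        for _ in range(rounds):
            for _ in range(Q):                  # Q local updates, no communication
                m = beta * m + (1 - beta) * y   # momentum on the tracked direction
                x = x - eta * m
                g = np.array([grad(i, x[i]) for i in range(n)])
                y = y + g - g_prev              # local tracking correction
                g_prev = g
            x, y = W @ x, W @ y                 # one gossip communication round
        return x.mean(axis=0)

Here W stands for a doubly stochastic mixing matrix of the network; increasing Q trades extra local computation for fewer communication rounds, which is the regime that the communication- and iteration-complexity claims above address.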