We propose Federated Accelerated Stochastic Gradient Descent (FedAc), a principled acceleration of Federated Averaging (FedAvg, also known as Local SGD) for distributed optimization. FedAc is the first provable acceleration of FedAvg that improves convergence speed and communication efficiency on various types of convex functions. For example, for strongly convex and smooth functions, when using $M$ workers, the previous state-of-the-art FedAvg analysis can achieve a linear speedup in $M$ if given $M$ rounds of synchronization, whereas FedAc only requires $M^{\frac{1}{3}}$ rounds. Moreover, we prove stronger guarantees for FedAc when the objectives are third-order smooth. Our technique is based on a potential-based perturbed iterate analysis, a novel stability analysis of generalized accelerated SGD, and a strategic tradeoff between acceleration and stability.
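To make the algorithmic idea concrete, here is a minimal sketch of a FedAc-style loop: each of $M$ workers runs a generalized accelerated SGD locally, and the coupled iterates are averaged at every synchronization round. The function name, hyperparameter couplings ($\alpha$, $\beta$, $\gamma$, $\eta$), and the toy quadratic objective are illustrative placeholders, not the paper's prescribed choices.

```python
import numpy as np

def fedac_sketch(grad_fn, dim, num_workers=4, rounds=10, local_steps=16,
                 eta=0.1, gamma=0.3, alpha=2.0, beta=2.0, seed=0):
    """Illustrative FedAc-style loop (not the paper's tuned parameterization):
    each worker runs generalized accelerated SGD locally; both iterate
    sequences are averaged at each synchronization round."""
    rng = np.random.default_rng(seed)
    # Each worker keeps two coupled iterates (w, w_ag), as in accelerated SGD.
    w = np.zeros((num_workers, dim))
    w_ag = np.zeros((num_workers, dim))
    for _ in range(rounds):
        for _ in range(local_steps):
            for m in range(num_workers):
                # Momentum-coupled point where the stochastic gradient is queried.
                w_md = w[m] / beta + (1 - 1 / beta) * w_ag[m]
                g = grad_fn(w_md, rng)           # stochastic gradient oracle
                w_ag[m] = w_md - eta * g         # short (primal) step
                w[m] = (1 - 1 / alpha) * w[m] + (1 / alpha) * w_md - gamma * g
        # Synchronization: average both iterate sequences across workers.
        w[:] = w.mean(axis=0)
        w_ag[:] = w_ag.mean(axis=0)
    return w_ag.mean(axis=0)

# Toy usage: noisy gradients of the strongly convex quadratic 0.5 * ||x - 1||^2.
if __name__ == "__main__":
    target = np.ones(5)
    noisy_grad = lambda x, rng: (x - target) + 0.1 * rng.standard_normal(x.shape)
    print(fedac_sketch(noisy_grad, dim=5))  # converges near the target vector
```

Setting $K =$ `local_steps` $> 1$ is what distinguishes this from fully synchronous accelerated SGD: communication happens only once per round, and the analysis in the paper quantifies how large $K$ can be while retaining acceleration.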