In the setting of federated optimization, where a global model is periodically aggregated from local updates, step asynchronism occurs when participants train at different paces, each fully utilizing its own computational resources. It is well acknowledged that step asynchronism leads to objective inconsistency under non-i.i.d. data, which degrades the model accuracy. To address this issue, we propose a new algorithm, \texttt{FedaGrac}, which calibrates the local update direction toward a predictive global orientation. Taking advantage of the estimated orientation, we guarantee that the aggregated model does not excessively deviate from the expected direction while fully utilizing the local updates of faster nodes. We theoretically prove that \texttt{FedaGrac} achieves an improved order of convergence rate over the state-of-the-art approaches and eliminates the negative effect of step asynchronism. Empirical results show that our algorithm accelerates training and enhances the final accuracy.
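To make the calibration mechanism concrete, the following is a minimal Python sketch of gradient-calibrated local updates under step asynchronism. It uses a SCAFFOLD-style control-variate correction as an illustrative stand-in, not the exact \texttt{FedaGrac} procedure; the toy quadratic objectives, the per-worker step counts, and the names \texttt{g\_hat} and \texttt{g\_locals} are all our own assumptions. Each worker runs a different number of local steps, but every local gradient is shifted by the gap between the estimated global orientation and the worker's own previous direction, so faster workers do not drift toward their local optima.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Toy non-i.i.d. setup: worker i holds f_i(w) = ||w - c_i||^2 / 2,
# so the global optimum is the mean of the local targets c_i.
dim = 5
targets = [rng.normal(size=dim) for _ in range(4)]
grad = lambda w, c: w - c

def local_update(w, c, steps, lr, g_hat, g_i):
    """Run `steps` calibrated local SGD steps from the global model `w`:
    each raw local gradient is shifted by (g_hat - g_i), replacing the
    worker's biased direction with the estimated global orientation."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * (grad(w, c) - g_i + g_hat)
    return w

w = np.zeros(dim)
g_hat = np.zeros(dim)                        # estimated global orientation
g_locals = [np.zeros(dim) for _ in targets]  # per-worker direction estimates
steps = [1, 3, 5, 20]                        # step asynchronism: unequal work
lr = 0.1

for _ in range(100):
    new_models = []
    for i, (c, K) in enumerate(zip(targets, steps)):
        w_i = local_update(w, c, K, lr, g_hat, g_locals[i])
        # Recover the worker's plain average gradient from the calibrated
        # path: (w - w_i) / (lr * K) equals avg_grad - g_i + g_hat.
        g_locals[i] = g_locals[i] - g_hat + (w - w_i) / (lr * K)
        new_models.append(w_i)
    w = np.mean(new_models, axis=0)          # periodic aggregation
    g_hat = np.mean(g_locals, axis=0)        # refresh the global orientation

print("distance to optimum:", np.linalg.norm(w - np.mean(targets, axis=0)))
\end{verbatim}

In this sketch, the fixed point is the global optimum: there, each worker's stored direction equals its true local gradient, their mean is zero, and the calibrated local steps vanish, even though the fastest worker takes twenty times as many steps as the slowest.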