In federated optimization, where a global model is aggregated periodically, step asynchronism arises when participants perform different numbers of local training steps according to their available computational resources. It is well acknowledged that step asynchronism leads to objective inconsistency under non-i.i.d. data, which degrades model accuracy. To address this issue, we propose a new algorithm, FedaGrac, which calibrates the local update direction toward a predictive global orientation. Leveraging the estimated orientation, we guarantee that the aggregated model does not deviate excessively from the global optimum while fully utilizing the local updates of faster nodes. We theoretically prove that FedaGrac achieves an improved order of convergence rate over state-of-the-art approaches and eliminates the negative effect of step asynchronism. Empirical results show that our algorithm accelerates training and enhances final accuracy.
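The calibration idea can be sketched as follows. This is a minimal illustration only, assuming a SCAFFOLD-style correction term; the function and variable names (`calibrated_local_steps`, `c_global`, `c_local`) are assumptions for exposition, not the paper's actual FedaGrac update rule.

```python
import numpy as np

def calibrated_local_steps(w, grads, c_global, c_local, lr):
    # Hypothetical sketch: each local gradient is shifted by the gap
    # between the estimated global direction (c_global) and this
    # worker's local direction (c_local), so the extra steps of faster
    # nodes keep following the global orientation instead of drifting
    # toward the worker's local optimum.
    for g in grads:
        w = w - lr * (g + c_global - c_local)
    return w

# Toy usage: two local steps on a 1-D parameter.
w = calibrated_local_steps(
    np.array([1.0]),
    grads=[np.array([1.0]), np.array([1.0])],
    c_global=np.array([0.5]),
    c_local=np.array([1.0]),
    lr=0.1,
)
```

Without the correction term, each step would move by the full local gradient; with it, the drift component `c_local - c_global` is cancelled at every local step rather than only at aggregation time.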