Federated learning has attracted increasing attention due to its promise of balancing privacy and large-scale learning, and numerous approaches have been proposed. However, most existing approaches focus on problems with balanced data, and prediction performance is far from satisfactory for many real-world applications where the number of samples in different classes is highly imbalanced. To address this challenging problem, we developed a novel federated learning method for imbalanced data by directly optimizing the area under the ROC curve (AUC) score. In particular, we formulate the AUC maximization problem as a federated compositional minimax optimization problem, develop a local stochastic compositional gradient descent ascent with momentum algorithm, and provide bounds on the computational and communication complexities of our algorithm. To the best of our knowledge, this is the first work to achieve such favorable theoretical results. Finally, extensive experimental results confirm the efficacy of our method.
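As background for the metric being optimized, the sketch below computes the AUC score directly as the Wilcoxon-Mann-Whitney statistic: the probability that a randomly chosen positive example is scored above a randomly chosen negative one. This is a generic illustration of why AUC suits imbalanced data (it is insensitive to class proportions), not the paper's federated algorithm; the scores and function name are hypothetical.

```python
def auc_score(scores_pos, scores_neg):
    """AUC = P(score of a random positive > score of a random negative).

    Computed by exhaustive pairwise comparison; ties count as 1/2
    (the Wilcoxon-Mann-Whitney statistic). Hypothetical helper for
    illustration only.
    """
    total = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                total += 1.0
            elif sp == sn:
                total += 0.5
    return total / (len(scores_pos) * len(scores_neg))

# Toy imbalanced example: 2 positives vs 4 negatives (hypothetical scores).
pos = [0.9, 0.6]
neg = [0.8, 0.3, 0.2, 0.1]
print(auc_score(pos, neg))  # 7 of 8 pairs correctly ordered -> 0.875
```

Because every positive-negative pair contributes equally, a classifier that trivially predicts the majority class gets a poor AUC even when its accuracy is high, which is what motivates optimizing AUC directly on imbalanced data.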