Class imbalance is widespread in real-world engineering. However, mainstream optimization algorithms that seek to minimize error trap deep learning models in sub-optima under extreme class imbalance, severely harming classification precision, especially on minority classes. The essential reason is that the gradients of the classifier weights are imbalanced among the components contributed by different classes. In this paper, we propose the Attraction-Repulsion-Balanced Loss (ARB-Loss) to balance these gradient components. We perform experiments on large-scale classification and segmentation datasets, and our ARB-Loss achieves state-of-the-art performance with only one-stage training instead of the two-stage learning used by current SOTA methods.
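To make the gradient-balancing idea concrete, here is a minimal, hypothetical PyTorch sketch of per-class loss reweighting for a softmax classifier. It illustrates the general principle of equalizing the per-class gradient components, not the paper's exact ARB-Loss formulation; the names `balanced_ce_loss` and `class_counts` are introduced here purely for illustration.

```python
import torch
import torch.nn.functional as F

def balanced_ce_loss(logits: torch.Tensor,
                     targets: torch.Tensor,
                     class_counts: torch.Tensor) -> torch.Tensor:
    """Cross-entropy with inverse-frequency class weights.

    Illustrative sketch only: scaling each class's loss term inversely
    to its frequency makes the gradient components that each class
    contributes to the classifier weights comparable in magnitude.
    """
    # Weight for class c: N / (C * n_c), so frequent classes are down-weighted.
    weights = class_counts.sum() / (class_counts.numel() * class_counts.float())
    return F.cross_entropy(logits, targets, weight=weights)

# Usage: 3 classes with a 100:10:1 imbalance.
logits = torch.randn(8, 3, requires_grad=True)
targets = torch.randint(0, 3, (8,))
loss = balanced_ce_loss(logits, targets, torch.tensor([1000, 100, 10]))
loss.backward()  # per-class loss terms now contribute comparable gradients
```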