Deep convolutional neural networks often perform poorly on datasets that suffer from both quantity imbalance and varying classification difficulty. Despite advances in the field, existing two-stage approaches still exhibit dataset bias or domain shift. To counter this, a phased progressive learning schedule is proposed that gradually shifts the training emphasis from representation learning to training the upper classifier; this approach is particularly beneficial for datasets with larger imbalances or fewer samples. In addition, a novel coupling-regulation-imbalance loss function is proposed, which combines three components: a correction term, the focal loss, and the LDAM loss. This loss is effective in addressing quantity imbalance and outliers, while regulating the focus of attention on samples of varying classification difficulty. These approaches yield satisfactory results on several benchmark datasets, including Imbalanced CIFAR10, Imbalanced CIFAR100, ImageNet-LT, and iNaturalist 2018, and can be easily generalized to other imbalanced classification models.
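To make the phased progressive schedule concrete, the following is a minimal sketch, not the authors' implementation: it assumes the progressive shift is realized by interpolating per-class sampling weights from instance-balanced (natural distribution, favoring representation learning) to class-balanced (favoring classifier training) with a ramp that starts partway through training. The linear ramp shape, the `start_frac` parameter, and the function name are illustrative assumptions.

```python
import torch
from torch.utils.data import WeightedRandomSampler

def progressive_sampler(labels: torch.Tensor,
                        class_counts: torch.Tensor,
                        epoch: int,
                        total_epochs: int,
                        start_frac: float = 0.4) -> WeightedRandomSampler:
    """Interpolate from instance-balanced to class-balanced sampling.

    phi ramps linearly from 0 (pure representation learning, natural
    distribution) to 1 (class-balanced classifier training) after
    `start_frac` of the epochs have elapsed; both the ramp shape and
    `start_frac` are assumptions for illustration.
    """
    t = (epoch / total_epochs - start_frac) / (1.0 - start_frac)
    phi = min(1.0, max(0.0, t))
    n = class_counts.float()
    # Per-class weight: blend uniform-per-sample (weight 1) with
    # inverse-frequency (rare classes up-weighted).
    per_class = (1.0 - phi) * torch.ones_like(n) + phi * (n.mean() / n)
    weights = per_class[labels]
    return WeightedRandomSampler(weights.tolist(), num_samples=len(labels))
```

Early in training the sampler behaves like ordinary shuffling, so the backbone sees the natural data distribution; by the final phase each class is drawn roughly equally often, concentrating the remaining updates on the classifier head.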
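The loss combination can likewise be sketched. The snippet below is a hedged reading of the coupling-regulation-imbalance idea, not the paper's exact formula: it assumes the "coupling" means applying the focal modulating factor $(1-p_t)^\gamma$ to a cross-entropy computed on LDAM-style margin-adjusted logits ($\Delta_j \propto n_j^{-1/4}$), and it models the correction term as a clamp on the modulating factor that limits the gradient contribution of extreme outliers. The `cap`, `scale`, and `max_margin` values and the function name are assumptions.

```python
import torch
import torch.nn.functional as F

def coupling_regulation_imbalance_loss(
    logits: torch.Tensor,        # (batch, num_classes) raw scores
    targets: torch.Tensor,       # (batch,) integer class labels
    class_counts: torch.Tensor,  # (num_classes,) training samples per class
    gamma: float = 1.0,          # focal exponent regulating easy/hard focus
    max_margin: float = 0.5,     # largest LDAM margin (rarest class)
    scale: float = 30.0,         # logit scaling commonly used with LDAM
    cap: float = 0.9,            # assumed correction cap on the focal factor
) -> torch.Tensor:
    # LDAM margins: Delta_j proportional to n_j^(-1/4), normalized so the
    # rarest class receives max_margin.
    margins = class_counts.float().pow(-0.25)
    margins = margins * (max_margin / margins.max())

    # Subtract the margin from the true-class logit only.
    adjusted = logits - F.one_hot(targets, logits.size(1)) * margins
    log_p = F.log_softmax(scale * adjusted, dim=1)
    log_pt = log_p.gather(1, targets.unsqueeze(1)).squeeze(1)
    pt = log_pt.exp()

    # Focal modulation focuses training on hard samples; the clamp is the
    # assumed correction term bounding the influence of outliers.
    focal_factor = (1.0 - pt).clamp(max=cap).pow(gamma)
    return -(focal_factor * log_pt).mean()
```

Under this reading, the margins handle quantity imbalance, the focal exponent regulates attention across classification difficulty, and the cap keeps mislabeled or outlier samples from dominating the gradient.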