Networks trained on a long-tailed dataset vary remarkably despite identical training settings, which reveals the great uncertainty in long-tailed learning. To alleviate this uncertainty, we propose Nested Collaborative Learning (NCL), which tackles the problem by collaboratively training multiple experts together. NCL consists of two core components, namely Nested Individual Learning (NIL) and Nested Balanced Online Distillation (NBOD), which focus on the individual supervised learning of each single expert and the knowledge transfer among multiple experts, respectively. To learn representations more thoroughly, both NIL and NBOD are formulated in a nested way, in which learning is conducted not only on all categories from a full perspective but also on some hard categories from a partial perspective. For the learning from the partial perspective, we select the negative categories with the highest predicted scores as the hard categories using the proposed Hard Category Mining (HCM). In NCL, the learning from the two perspectives is nested, highly related, and complementary, helping the network to capture not only global, robust features but also a meticulous ability to distinguish among categories. Moreover, self-supervision is further utilized for feature enhancement. Extensive experiments demonstrate the superiority of our method, which outperforms the state of the art whether a single model or an ensemble is used.
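As an illustration of the hard-category selection in HCM, the following is a minimal PyTorch-style sketch: for each sample it ranks the negative categories by predicted score and keeps the top k as the hard categories of the partial view. The function name, the choice of k, and the returned index layout are assumptions for illustration, not details from the paper.

```python
import torch

def hard_category_mining(logits: torch.Tensor, labels: torch.Tensor,
                         k: int = 30) -> torch.Tensor:
    """Select, per sample, the k negative categories with the highest
    predicted scores ("hard" categories) and return them together with
    the positive category as the partial-perspective category set."""
    masked = logits.clone()
    # Exclude the positive (ground-truth) category from the ranking.
    masked.scatter_(1, labels.unsqueeze(1), float('-inf'))
    # Indices of the k highest-scoring negative categories per sample.
    _, hard_idx = masked.topk(k, dim=1)
    # Partial perspective: the positive category plus its k hard negatives.
    return torch.cat([labels.unsqueeze(1), hard_idx], dim=1)

# Usage on a hypothetical batch: 4 samples, 100 categories.
logits = torch.randn(4, 100)
labels = torch.tensor([3, 17, 42, 99])
partial_idx = hard_category_mining(logits, labels, k=30)  # shape (4, 31)
```

Losses computed over these indices would then realize the partial perspective, complementing the full-perspective losses computed over all categories.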