Deep learning has achieved remarkable progress in visual recognition on large-scale balanced datasets but still performs poorly on real-world long-tailed data. Previous methods often adopt class re-balanced training strategies to alleviate the imbalance issue effectively, but they risk over-fitting to tail classes. The recent decoupling method overcomes this over-fitting issue with a multi-stage training scheme, yet it still fails to capture tail-class information during the feature learning stage. In this paper, we show that soft labels can serve as a powerful means of incorporating label correlation into a multi-stage training scheme for long-tailed recognition. The intrinsic relations between classes embodied by soft labels turn out to benefit long-tailed recognition by transferring knowledge from head to tail classes. Specifically, we propose a conceptually simple yet particularly effective multi-stage training scheme, termed Self Supervised to Distillation (SSD). This scheme is composed of two parts. First, we introduce a self-distillation framework for long-tailed recognition, which mines label relations automatically. Second, we present a new distillation label generation module guided by self-supervision. The distilled labels integrate information from both the label and data domains, modeling the long-tailed distribution effectively. We conduct extensive experiments, and our method achieves state-of-the-art results on three long-tailed recognition benchmarks: ImageNet-LT, CIFAR100-LT and iNaturalist 2018. Our SSD outperforms the strong LWS baseline by $2.7\%$ to $4.5\%$ across these datasets. The code is available at https://github.com/MCG-NJU/SSD-LT.
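As a minimal sketch of the soft-label distillation objective the abstract alludes to: a teacher's temperature-softened predictions act as soft labels, and the student minimizes the KL divergence to them. The function names, the temperature `T`, and the use of plain numpy are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions,
    # exposing inter-class relations encoded in the teacher's logits.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_label_distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, T)                    # soft labels
    log_p = np.log(p + 1e-12)
    log_q = np.log(softmax(student_logits, T) + 1e-12)
    kl = (p * (log_p - log_q)).sum(axis=-1).mean()
    return (T ** 2) * kl
```

In a long-tailed setting, head-class samples that the teacher deems partially similar to tail classes receive non-zero soft-label mass on those tail classes, which is how knowledge can transfer from head to tail during the student's training.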