The effective application of contrastive learning techniques to natural language processing tasks demonstrates the strength of contrastive learning in text analysis. Constructing positive and negative samples correctly and reasonably is the core challenge of contrastive learning. Because contrastive objects are difficult to construct in multi-label multi-classification tasks, few contrastive losses exist for multi-label multi-classification text classification. In this paper, we propose five contrastive losses for multi-label multi-classification tasks: Strict Contrastive Loss (SCL), Intra-label Contrastive Loss (ICL), Jaccard Similarity Contrastive Loss (JSCL), Jaccard Similarity Probability Contrastive Loss (JSPCL), and Stepwise Label Contrastive Loss (SLCL). We explore the effectiveness of contrastive learning for multi-label multi-classification tasks under these different strategies and provide a set of baseline methods for applying contrastive learning techniques to multi-label classification tasks. We also perform an interpretability analysis of our approach to show how the different contrastive learning methods play their roles. The experimental results demonstrate that our proposed contrastive losses bring measurable improvement on multi-label multi-classification tasks. Our work reveals that "appropriately" changing the way samples are contrasted is the key to improving the adaptability of contrastive learning to multi-label multi-classification tasks.
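The abstract names the losses without giving their formulations, but the core idea behind a Jaccard-similarity-based loss such as JSCL can be illustrated with a minimal sketch: instead of a hard positive/negative split, each pair of examples contributes to the loss in proportion to the Jaccard overlap of their label sets. The sketch below assumes InfoNCE-style logits over L2-normalized embeddings and multi-hot label vectors; the function names `jaccard_similarity` and `jaccard_contrastive_loss` and the exact weighting scheme are our illustrative assumptions, not the paper's definitions.

```python
import torch
import torch.nn.functional as F


def jaccard_similarity(labels: torch.Tensor) -> torch.Tensor:
    """Pairwise Jaccard similarity between multi-hot label vectors.

    labels: (batch, num_labels) float tensor of 0s and 1s.
    Returns: (batch, batch) similarity matrix with values in [0, 1].
    """
    inter = labels @ labels.T  # |A ∩ B| for every pair
    union = labels.sum(1, keepdim=True) + labels.sum(1) - inter  # |A ∪ B|
    return inter / union.clamp(min=1)  # clamp guards empty label sets


def jaccard_contrastive_loss(embeds: torch.Tensor,
                             labels: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """Soft contrastive loss: a pair's 'positiveness' is its Jaccard
    label overlap rather than a binary positive/negative decision."""
    z = F.normalize(embeds, dim=1)
    logits = z @ z.T / temperature
    batch = z.size(0)
    # True for every non-self pair; self-similarity is excluded.
    mask = ~torch.eye(batch, dtype=torch.bool, device=z.device)
    log_prob = logits - torch.logsumexp(
        logits.masked_fill(~mask, float("-inf")), dim=1, keepdim=True)
    weights = jaccard_similarity(labels) * mask.float()
    # Weighted average of log-probabilities over label-overlapping pairs.
    per_anchor = -(weights * log_prob).sum(1) / weights.sum(1).clamp(min=1e-8)
    return per_anchor.mean()
```

In practice such a term would be combined with the usual binary cross-entropy classification loss; the soft weighting is what lets the contrastive objective remain well-defined when two examples share some, but not all, of their labels.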