In this work, we focus on a more challenging few-shot intent detection scenario where many intents are fine-grained and semantically similar. We present a simple yet effective few-shot intent detection scheme via contrastive pre-training and fine-tuning. Specifically, we first conduct self-supervised contrastive pre-training on collected intent datasets, which implicitly learns to discriminate semantically similar utterances without using any labels. We then perform few-shot intent detection together with supervised contrastive learning, which explicitly pulls utterances from the same intent closer together and pushes utterances from different intents farther apart. Experimental results show that our proposed method achieves state-of-the-art performance on three challenging intent detection datasets under 5-shot and 10-shot settings.
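The supervised contrastive objective described above can be sketched as follows: for each anchor utterance embedding, same-intent embeddings act as positives and all others as negatives in a softmax over similarities. This is a minimal plain-Python illustration of that standard formulation, not the authors' implementation; the function name, the assumption of L2-normalized embeddings, and the default temperature are all illustrative.

```python
import math

def sup_con_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss.

    Pulls same-label (same-intent) embeddings together and pushes
    different-label embeddings apart. Assumes embeddings are
    L2-normalized, so the dot product is cosine similarity.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    n = len(embeddings)
    total, counted = 0.0, 0
    for i in range(n):
        # Positives: other utterances sharing the anchor's intent label.
        positives = [p for p in range(n) if p != i and labels[p] == labels[i]]
        if not positives:
            continue  # anchors with no positives contribute nothing

        # Denominator: similarities to every other utterance in the batch.
        denom = sum(math.exp(dot(embeddings[i], embeddings[a]) / temperature)
                    for a in range(n) if a != i)

        # Average negative log-probability over the anchor's positives.
        loss_i = -sum(
            math.log(math.exp(dot(embeddings[i], embeddings[p]) / temperature) / denom)
            for p in positives
        ) / len(positives)
        total += loss_i
        counted += 1
    return total / counted
```

With two well-separated intent clusters, assigning the true labels yields a much lower loss than a scrambled label assignment, which is exactly the pressure that separates semantically similar intents during fine-tuning.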