Intent detection and slot filling are two main tasks in natural language understanding (NLU) for identifying users' needs from their utterances. These two tasks are highly related and often trained jointly. However, most previous works assume that each utterance corresponds to only one intent, ignoring the fact that a user utterance in many cases can include multiple intents. In this paper, we propose a novel Self-Distillation Joint NLU model (SDJN) for multi-intent NLU. First, we formulate multiple intent detection as a weakly supervised problem and approach it with multiple instance learning (MIL). Then, we design an auxiliary loop via self-distillation with three orderly arranged decoders: Initial Slot Decoder, MIL Intent Decoder, and Final Slot Decoder. The output of each decoder serves as auxiliary information for the next decoder. With the auxiliary knowledge provided by the MIL Intent Decoder, we set the Final Slot Decoder as the teacher model, which imparts knowledge back to the Initial Slot Decoder to complete the loop. The auxiliary loop enables intents and slots to guide each other in depth and further boosts overall NLU performance. Experimental results on two public multi-intent datasets indicate that our model achieves strong performance compared to other models.
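To make the described pipeline concrete, the following is a minimal PyTorch sketch of the three-decoder auxiliary loop, under stated assumptions: the shared encoder is modeled as a single BiLSTM, each decoder as a linear head, the MIL pooling as a token-wise max over the utterance (the "bag"), and the self-distillation as a KL term from the Final Slot Decoder (teacher) to the Initial Slot Decoder (student). All module names, dimensions, and pooling choices here are illustrative assumptions, not the authors' exact implementation.

```python
# A minimal, illustrative sketch of the SDJN pipeline described above.
# All module names, dimensions, and design choices are assumptions for
# illustration; they are not the authors' exact implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SDJNSketch(nn.Module):
    def __init__(self, vocab_size, num_intents, num_slots, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Shared utterance encoder (hypothetical: one BiLSTM layer).
        self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True,
                               bidirectional=True)
        # Three orderly arranged decoders, modeled here as linear heads;
        # each one consumes the previous decoder's output as auxiliary input.
        self.initial_slot = nn.Linear(hidden, num_slots)
        self.mil_intent = nn.Linear(hidden + num_slots, num_intents)
        self.final_slot = nn.Linear(hidden + num_intents, num_slots)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))          # (B, T, hidden)

        # 1) Initial Slot Decoder: first-pass slot logits per token.
        slot1 = self.initial_slot(h)                     # (B, T, num_slots)

        # 2) MIL Intent Decoder: token-level intent evidence, pooled over
        #    the utterance (the MIL "bag") so only utterance-level
        #    multi-intent labels are needed (weak supervision).
        token_intent = torch.sigmoid(
            self.mil_intent(torch.cat([h, slot1], dim=-1)))
        utter_intent = token_intent.max(dim=1).values    # (B, num_intents)

        # 3) Final Slot Decoder: slot logits guided by detected intents.
        slot2 = self.final_slot(torch.cat([h, token_intent], dim=-1))

        return slot1, utter_intent, slot2


def self_distillation_loss(slot1, slot2, temperature=1.0):
    """Final Slot Decoder (teacher) imparts knowledge back to the Initial
    Slot Decoder (student) via a KL term, closing the auxiliary loop."""
    teacher = F.softmax(slot2.detach() / temperature, dim=-1)
    student = F.log_softmax(slot1 / temperature, dim=-1)
    return F.kl_div(student, teacher, reduction="batchmean")
```

In this sketch, the distillation term would be added to the standard supervised slot and intent losses; the `detach()` on the teacher logits ensures knowledge flows only from the Final Slot Decoder back to the Initial Slot Decoder, matching the teacher-student direction described above.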