Spoken Language Understanding (SLU), which comprises intent detection and slot filling, is a core component of human-computer interaction. The inherent relationship between the two subtasks places higher demands on fine-grained feature interaction, i.e., between token-level intent features and slot features. Previous work has mainly focused on jointly modeling the relationship between the two subtasks with attention-based models, while leaving the order of attention unexplored. In this paper, we propose to replace conventional attention with our Bilinear attention block, and show that the resulting Higher-order Attention Network (HAN) brings improvements on the SLU task. Importantly, we conduct an extensive analysis to explore where the effectiveness of higher-order attention comes from.
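The abstract does not spell out the block's equations; as a minimal sketch of the general idea, bilinear attention replaces the first-order dot-product score q·k with a second-order bilinear form qᵀWk, so every pair of query and key feature dimensions can interact through a learned matrix W. The Python sketch below is illustrative only: the function names, the NumPy setup, and the reading of intent features as queries and slot features as keys/values are assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(Q, K, V):
    # Conventional (first-order) attention: scores are plain dot products q_i . k_j.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

def bilinear_attention(Q, K, V, W):
    # Bilinear (second-order) attention: a learned matrix W mediates every
    # query-key interaction, scoring q_i^T W k_j instead of q_i^T k_j.
    scores = Q @ W @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
T, d = 5, 8                                # sequence length, feature dimension
intent_feats = rng.normal(size=(T, d))     # hypothetical token-level intent features
slot_feats = rng.normal(size=(T, d))       # hypothetical token-level slot features
W = rng.normal(size=(d, d)) * 0.1          # bilinear weight (learned in practice, random here)

out = bilinear_attention(intent_feats, slot_feats, slot_feats, W)
print(out.shape)  # (5, 8)
```

With W fixed to the identity, the bilinear score reduces to the ordinary dot product, which makes the "order" of the interaction the only moving part in this comparison.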