In dialogue state tracking, dialogue history is a crucial resource, and its utilization varies across different models. However, no matter how the dialogue history is used, each existing model relies on its own fixed dialogue history throughout the entire state tracking process, regardless of which slot is being updated. In practice, updating different slots at different turns requires different parts of the dialogue history. Consequently, using fixed dialogue contents may provide insufficient or redundant information for different slots, which hurts overall performance. To address this problem, we devise DiCoS-DST to dynamically select the relevant dialogue contents corresponding to each slot for state updating. Specifically, it first retrieves turn-level utterances of the dialogue history and evaluates their relevance to the slot from a combination of three perspectives: (1) their explicit connection to the slot name; (2) their relevance to the current turn dialogue; (3) implicit mention oriented reasoning. These perspectives are then combined to yield a selection decision, and only the selected dialogue contents are fed into the State Generator, which explicitly minimizes the distracting information passed to downstream state prediction. Experimental results show that our approach achieves new state-of-the-art performance on MultiWOZ 2.1 and MultiWOZ 2.2, and achieves superior performance on multiple mainstream benchmark datasets (including Sim-M, Sim-R, and DSTC2).
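The per-slot selection described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual model (which uses learned neural scoring modules); the function names, scoring callables, and the simple score-averaging threshold are all hypothetical stand-ins for the three learned relevance perspectives and the combined decision.

```python
# Hypothetical sketch of per-slot dialogue-content selection.
# The three score_* callables stand in for DiCoS-DST's learned modules:
# (1) explicit connection to the slot name,
# (2) relevance to the current turn dialogue,
# (3) implicit-mention oriented reasoning.

def select_dialogue_contents(history, slot, current_turn,
                             score_slot_link, score_turn_relevance,
                             score_implicit_mention, threshold=0.5):
    """Keep only the history turns judged relevant to `slot`, combining
    the three relevance perspectives into one gating decision per turn."""
    selected = []
    for turn in history:
        s1 = score_slot_link(turn, slot)                 # perspective (1)
        s2 = score_turn_relevance(turn, current_turn)    # perspective (2)
        s3 = score_implicit_mention(turn, slot, history) # perspective (3)
        # Combined decision (here a simple average; the real model learns this).
        if (s1 + s2 + s3) / 3.0 >= threshold:
            selected.append(turn)
    # Only the selected turns would be fed to the state generator,
    # limiting distracting information in downstream state prediction.
    return selected
```

With toy keyword-overlap scorers, a turn mentioning the slot or sharing words with the current turn is kept, while an unrelated turn is filtered out before state generation.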