Recently, prompt-tuning has achieved promising results on some few-shot classification tasks. The core idea of prompt-tuning is to insert text pieces, i.e., templates, into the input and transform a classification task into a masked language modeling problem. However, for relation extraction, determining an appropriate prompt template requires domain expertise, and handcrafted or auto-searched single label words are cumbersome and time-consuming to verify for effectiveness in non-few-shot scenarios. Further, there is abundant semantic knowledge in the entities and relation labels that cannot be ignored. To this end, we focus on incorporating knowledge into prompt-tuning for relation extraction and propose a Knowledge-aware prompt-tuning with synergistic optimization (KNIGHT) approach. Specifically, we inject entity and relation knowledge into prompt construction with learnable virtual template words and answer words, and jointly optimize their representations under knowledge constraints. Extensive experimental results on five datasets under standard and low-resource settings demonstrate the effectiveness of our approach.
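The template-plus-answer-word mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's exact design: the template string, the entity placement, and the `VERBALIZER` mapping are all hypothetical assumptions made for clarity.

```python
# Illustrative sketch of prompt-based relation extraction: a template
# wraps the input so a masked language model predicts the relation as
# the token filling the mask position. (Not the paper's exact template.)

def build_prompt(sentence: str, subj: str, obj: str,
                 mask_token: str = "[MASK]") -> str:
    """Insert a template after the input; the relation label word
    is predicted at the mask position."""
    return f"{sentence} {subj} {mask_token} {obj}."

# Verbalizer: maps the token predicted at the mask back to a relation
# label. This mapping is hypothetical, for illustration only.
VERBALIZER = {
    "founded": "org:founded_by",
    "born": "per:city_of_birth",
}

def decode_relation(predicted_token: str) -> str:
    """Map a predicted answer word to its relation label."""
    return VERBALIZER.get(predicted_token, "no_relation")

prompt = build_prompt("Bill Gates founded Microsoft.",
                      "Bill Gates", "Microsoft")
print(prompt)
# A masked language model would score vocabulary tokens at [MASK];
# assuming it predicts "founded", the decoded relation is:
print(decode_relation("founded"))
```

The KNIGHT approach further replaces fixed template and answer words with learnable virtual tokens informed by entity and relation knowledge, rather than a fixed string template and hand-built verbalizer as in this sketch.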