Few-shot relation extraction aims to predict the relation between a pair of entities in a sentence by training with only a few labelled examples per relation. Some recent works have introduced relation information (i.e., relation labels or descriptions) to assist model learning based on Prototype Network. However, most of them constrain the prototypes of each relation class only implicitly with relation information, generally by designing complex network structures, such as generating hybrid features or combining with contrastive learning or attention networks. We argue that relation information can be introduced into the model more explicitly and effectively. This paper therefore proposes a direct addition approach to introduce relation information. Specifically, for each relation class, a relation representation is first generated by concatenating two views of the relation (i.e., the [CLS] token embedding and the mean of the embeddings of all tokens) and is then directly added to the original prototype for both training and prediction. Experimental results on the benchmark dataset FewRel 1.0 show significant improvements and are comparable to the state of the art, which demonstrates the effectiveness of our proposed approach. Further analyses also verify that direct addition is a much more effective way to integrate the relation representations with the original prototypes.
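The core mechanism above can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the function names are ours, the encoder is abstracted away as precomputed embeddings, and we assume the support-instance embeddings already have the same dimensionality as the concatenated relation representation so that the direct addition is well defined.

```python
import numpy as np

def relation_representation(cls_emb, token_embs):
    """Build the relation representation from two views of the
    relation description: the [CLS] embedding and the mean of
    all token embeddings, concatenated along the feature axis."""
    return np.concatenate([cls_emb, token_embs.mean(axis=0)])

def enhanced_prototype(support_embs, rel_repr):
    """Original prototype = mean of support-instance embeddings;
    the relation representation is then directly added to it."""
    return support_embs.mean(axis=0) + rel_repr

# Toy usage with hidden size d = 4 (so the relation
# representation and prototype both live in 2d = 8 dims).
d = 4
cls_emb = np.random.randn(d)          # [CLS] view of the relation
token_embs = np.random.randn(6, d)    # per-token embeddings of the description
rel = relation_representation(cls_emb, token_embs)   # shape (8,)

support_embs = np.random.randn(5, 2 * d)  # 5 support instances, dim 2d
proto = enhanced_prototype(support_embs, rel)        # shape (8,)
```

At prediction time, the same addition is applied, and a query is classified by nearest enhanced prototype (e.g., by Euclidean distance or dot product), as in standard Prototype Network inference.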