Few-shot relation extraction (FSRE) focuses on recognizing novel relations by learning with merely a handful of annotated instances. Meta-learning has been widely adopted for this task: models are trained on randomly generated few-shot tasks to learn generic data representations. Despite the impressive results achieved, existing models still perform suboptimally on hard FSRE tasks, where the relations are fine-grained and similar to one another. We argue that this is largely because existing models do not distinguish hard tasks from easy ones during learning. In this paper, we introduce a novel approach based on contrastive learning that learns better representations by exploiting relation label information. We further design a method that allows the model to adaptively learn how to focus on hard tasks. Experiments on two standard datasets demonstrate the effectiveness of our method.
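The label-aware contrastive objective described above can be illustrated with a supervised contrastive loss, where instances sharing a relation label act as positives and all other instances as negatives. The following is a minimal NumPy sketch under that assumption; it is an illustration of the general technique, not the paper's actual implementation (function names and the temperature value are hypothetical):

```python
import numpy as np

def _logsumexp(x, axis):
    # Numerically stable log-sum-exp; -inf entries contribute zero mass.
    m = x.max(axis=axis, keepdims=True)
    return m + np.log(np.exp(x - m).sum(axis=axis, keepdims=True))

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Pull together instances with the same relation label,
    push apart instances with different labels."""
    # L2-normalize so the dot product is cosine similarity.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    # Exclude each instance from its own denominator.
    sim = np.where(self_mask, -np.inf, sim)
    log_prob = sim - _logsumexp(sim, axis=1)
    # Positives: same label, excluding self.
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    # Average log-probability of positives per anchor, negated.
    per_anchor = -np.where(pos, log_prob, 0.0).sum(1) / np.maximum(pos.sum(1), 1)
    return per_anchor.mean()
```

Intuitively, the loss is small when same-relation instances cluster tightly relative to different-relation ones, which is exactly the representation property needed to separate fine-grained, similar relations in hard tasks.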