Few-shot relation learning refers to inferring facts for relations with only a limited number of observed triples. Existing metric-learning methods for this problem mostly neglect entity interactions within and between triples. In this paper, we explore these fine-grained semantics and propose our model TransAM. Specifically, we serialize reference entities and query entities into a sequence and apply a Transformer architecture with local-global attention to capture both intra- and inter-triple entity interactions. Experiments on two public benchmark datasets, NELL-One and Wiki-One, under the 1-shot setting demonstrate the effectiveness of TransAM.
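The abstract names a local-global attention pattern over serialized reference and query entities. Below is a minimal sketch of how such an attention mask could be constructed, assuming a hypothetical serialization in which each reference triple and the query triple contribute two entity tokens (head and tail); the block-diagonal local pattern and the treatment of query tokens as global positions are illustrative assumptions, not the paper's confirmed design.

```python
import numpy as np

def local_global_mask(num_refs: int, tokens_per_triple: int = 2) -> np.ndarray:
    """Boolean mask where entry (i, j) is True if position i may attend to j.

    Assumed layout: num_refs reference triples followed by one query triple,
    each serialized as `tokens_per_triple` entity tokens (head, tail).
    Local attention models intra-triple entity interactions; global
    attention from/to the query tokens models inter-triple interactions.
    """
    n = (num_refs + 1) * tokens_per_triple  # references plus one query
    mask = np.zeros((n, n), dtype=bool)

    # Local: block-diagonal attention within each triple.
    for t in range(num_refs + 1):
        s = t * tokens_per_triple
        mask[s:s + tokens_per_triple, s:s + tokens_per_triple] = True

    # Global: query tokens (last triple) attend to and are attended by all.
    q = num_refs * tokens_per_triple
    mask[q:, :] = True
    mask[:, q:] = True
    return mask

if __name__ == "__main__":
    # 1-shot setting: one reference triple plus the query triple.
    print(local_global_mask(num_refs=1).astype(int))
```

Under this assumed layout, the mask would be applied additively (with masked positions set to negative infinity) to the attention logits of a standard Transformer encoder.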