Relation classification aims to identify the semantic relation between two entities in a given text. While existing models classify inverse relations well when large datasets are available, their performance drops significantly in few-shot settings. In this paper, we propose a function-words adaptively enhanced attention framework (FAEA) for few-shot inverse relation classification, in which a hybrid attention model is designed to attend to class-related function words based on meta-learning. Because the involvement of function words introduces significant intra-class redundancy, an adaptive message-passing mechanism is introduced to capture and transfer inter-class differences. We mathematically analyze the negative impact of function words on dot-product measurement, which explains why the message-passing mechanism effectively reduces this impact. Our experimental results show that FAEA outperforms strong baselines; in particular, inverse-relation accuracy is improved by 14.33% under the 1-shot setting on FewRel 1.0.
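The intuition behind the dot-product analysis can be illustrated with a toy example (this is a hedged sketch, not the paper's actual model or data): when two sentence vectors share large components on function-word dimensions, their dot-product similarity is inflated even though they belong to different relation classes, while restricting the measurement to content-word dimensions exposes the inter-class difference.

```python
import numpy as np

# Toy bag-of-words vectors (hypothetical vocabulary, for illustration only).
# Dimensions 0-2: function words (e.g. "the", "of", "is");
# dimensions 3-5: content words that actually signal the relation.
func_idx = [0, 1, 2]
content_idx = [3, 4, 5]

# Two sentences from DIFFERENT relation classes that share function words.
s1 = np.array([3.0, 2.0, 1.0, 2.0, 0.0, 0.0])  # class A content word active
s2 = np.array([3.0, 2.0, 1.0, 0.0, 2.0, 0.0])  # class B content word active

def cos(a, b):
    """Cosine (normalized dot-product) similarity."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

full_sim = cos(s1, s2)                               # inflated by function words
content_sim = cos(s1[content_idx], s2[content_idx])  # inter-class signal only
print(full_sim, content_sim)  # full similarity is high, content similarity is 0
```

Here the shared function-word mass makes the cross-class similarity high, which is the intra-class redundancy problem the abstract describes; a mechanism that isolates inter-class differences (as the paper's message passing does) removes this inflation.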