Lexical relations describe how concepts are semantically related, typically expressed as relation triples. Accurately predicting lexical relations between concepts is challenging due to the sparsity of patterns that indicate the existence of such relations. We propose the Knowledge-Enriched Meta-Learning (KEML) framework to address the task of lexical relation classification. In KEML, the LKB-BERT (Lexical Knowledge Base-BERT) model is presented to learn concept representations from massive text corpora, with rich lexical knowledge injected via distant supervision. A probabilistic distribution over auxiliary tasks is defined to increase the model's ability to recognize different types of lexical relations. We further combine a meta-learning process over the auxiliary task distribution with supervised learning to train the neural lexical relation classifier. Experiments over multiple datasets show that KEML outperforms state-of-the-art methods.
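To make the two-phase training scheme concrete, below is a minimal, hypothetical sketch: a Reptile-style meta-learning loop over tasks sampled from an auxiliary task distribution, followed by ordinary supervised training of the relation classifier. Everything in it is an illustrative assumption, not the authors' implementation: random features stand in for LKB-BERT concept embeddings, `make_task` stands in for the paper's auxiliary task distribution, and Reptile is just one possible meta-learning algorithm.

```python
# Hypothetical sketch of KEML-style training: Reptile meta-learning over
# sampled auxiliary tasks, then supervised training of the classifier.
# Random features stand in for LKB-BERT embeddings (illustrative only).
import copy
import torch
import torch.nn as nn

EMB_DIM, N_RELATIONS = 64, 5  # stand-ins for embedding size / label set

class RelationClassifier(nn.Module):
    """Scores a (concept, concept) pair from their embeddings."""
    def __init__(self, emb_dim, n_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * emb_dim, 128), nn.ReLU(), nn.Linear(128, n_classes)
        )

    def forward(self, a, b):
        return self.net(torch.cat([a, b], dim=-1))

def make_task(n=32):
    """Sample one auxiliary task as a synthetic batch of labeled pairs.
    In KEML this would draw from the distribution over relation types."""
    a, b = torch.randn(n, EMB_DIM), torch.randn(n, EMB_DIM)
    y = torch.randint(0, N_RELATIONS, (n,))
    return a, b, y

model = RelationClassifier(EMB_DIM, N_RELATIONS)
loss_fn = nn.CrossEntropyLoss()

# --- Meta-learning phase: Reptile update over sampled auxiliary tasks ---
meta_lr, inner_lr, inner_steps = 0.1, 1e-2, 5
for _ in range(100):                       # meta-iterations
    a, b, y = make_task()                  # sample a task from the distribution
    learner = copy.deepcopy(model)         # clone for inner-loop adaptation
    opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
    for _ in range(inner_steps):           # adapt on the sampled task
        opt.zero_grad()
        loss_fn(learner(a, b), y).backward()
        opt.step()
    with torch.no_grad():                  # move meta-params toward adapted ones
        for p, q in zip(model.parameters(), learner.parameters()):
            p += meta_lr * (q - p)

# --- Supervised phase: ordinary training on the labeled relation dataset ---
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
a, b, y = make_task(256)                   # placeholder for the real training set
for _ in range(50):
    opt.zero_grad()
    loss_fn(model(a, b), y).backward()
    opt.step()
```

The meta-phase only nudges the initialization toward parameters that adapt quickly to new relation-recognition tasks; the final supervised phase then trains the classifier on the actual labeled data from that initialization.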