Dependency trees convey rich structural information that has proven useful for extracting relations among entities in text. However, how to effectively make use of relevant information while ignoring irrelevant information from the dependency trees remains a challenging research question. Existing approaches employing rule-based hard-pruning strategies for selecting relevant partial dependency structures may not always yield optimal results. In this work, we propose Attention Guided Graph Convolutional Networks (AGGCNs), a novel model that directly takes full dependency trees as inputs. Our model can be understood as a soft-pruning approach that automatically learns how to selectively attend to the relevant sub-structures useful for the relation extraction task. Extensive results on various tasks, including cross-sentence n-ary relation extraction and large-scale sentence-level relation extraction, show that our model is able to better leverage the structural information of the full dependency trees, giving significantly better results than previous approaches.
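The core idea of soft pruning can be illustrated with a minimal NumPy sketch (not the authors' implementation, and omitting the multi-head and densely connected components of the full model): instead of a hard-pruned 0/1 adjacency matrix derived from the dependency tree, a self-attention layer produces a fully connected, softly weighted adjacency matrix that a GCN layer then convolves over. The names `Wq`, `Wk`, and `attention_guided_adjacency` are illustrative assumptions, not names from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_guided_adjacency(H, Wq, Wk):
    # Soft pruning: scaled dot-product self-attention over token
    # representations H yields edge weights for EVERY token pair,
    # replacing a hard-pruned 0/1 dependency adjacency matrix.
    # Wq, Wk are hypothetical learned projections.
    scores = (H @ Wq) @ (H @ Wk).T / np.sqrt(Wq.shape[1])
    return softmax(scores, axis=-1)  # each row sums to 1

def gcn_layer(A, H, W):
    # One graph convolution over the soft adjacency: ReLU(A H W)
    return np.maximum(A @ H @ W, 0.0)

rng = np.random.default_rng(0)
n, d = 5, 8  # 5 tokens in the sentence, hidden size 8
H = rng.normal(size=(n, d))
A_soft = attention_guided_adjacency(
    H, rng.normal(size=(d, d)), rng.normal(size=(d, d))
)
H_out = gcn_layer(A_soft, H, rng.normal(size=(d, d)))
print(A_soft.shape, H_out.shape)  # (5, 5) (5, 8)
```

Because the attention weights are learned end-to-end with the relation extraction objective, the model can down-weight irrelevant edges rather than committing to a fixed rule-based pruning of the tree.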