We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a light-weight reasoning on BERT embeddings, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
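The abstract compresses several mechanisms into a few sentences: an exhaustive span search, entity classification with a "none" class that filters spans, and relation classification over a localized, marker-free context, all computed from one shared BERT pass. The sketch below illustrates how these pieces fit together. It is a minimal illustration under assumed names and shapes, not SpERT's actual implementation: the random stand-in for the BERT embeddings and the plain linear classifiers are placeholders, and the published model additionally conditions on span-width embeddings and the [CLS] token.

```python
# Minimal sketch of SpERT-style span reasoning on top of a single BERT pass.
# All names, shapes, and the random stand-in embeddings are illustrative
# assumptions, not the published implementation.
import torch
import torch.nn as nn

hidden = 768                # BERT hidden size
num_entity_types = 5        # includes class 0 = "none", used to filter spans
num_relation_types = 3
max_span_width = 5

# Stand-in for the single BERT pass over one sentence: (seq_len, hidden).
tokens = torch.randn(12, hidden)
seq_len = tokens.size(0)

entity_clf = nn.Linear(hidden, num_entity_types)
relation_clf = nn.Linear(3 * hidden, num_relation_types)  # head ⊕ context ⊕ tail

def span_repr(i: int, j: int) -> torch.Tensor:
    """Max-pool the token embeddings of span [i, j), with j > i."""
    return tokens[i:j].max(dim=0).values

# 1) Exhaustive span search + entity classification, reusing the one BERT pass.
candidates = [(i, j) for i in range(seq_len)
              for j in range(i + 1, min(i + max_span_width, seq_len) + 1)]
entities = [(i, j) for (i, j) in candidates
            if entity_clf(span_repr(i, j)).argmax().item() != 0]  # drop "none"

# 2) Relation classification over pairs of surviving spans. The localized,
#    marker-free context is a max-pool of the tokens *between* the two spans
#    (a zero vector when the spans are adjacent or overlap).
for head in entities:
    for tail in entities:
        if head == tail:
            continue
        first, second = (head, tail) if head[0] <= tail[0] else (tail, head)
        ctx = (span_repr(first[1], second[0])
               if first[1] < second[0] else torch.zeros(hidden))
        logits = relation_clf(torch.cat([span_repr(*head), ctx, span_repr(*tail)]))
```

Because every span representation is pooled from the same encoder output, both the exhaustive span search and the strong within-sentence negative samples cost only cheap pooling and linear layers rather than repeated BERT runs.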