In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also empirically demonstrate the advantage of second-order parsing over first-order parsing, and observe that the benefit of the head-selection structured constraint vanishes when BERT embeddings are used.