There is growing interest in investigating what neural NLP models learn about language. A prominent open question is whether it is necessary to model hierarchical structure. We present a linguistic investigation of a neural parser that adds insights to this question. We examine transitivity and agreement information in auxiliary verb constructions (AVCs) compared to finite main verbs (FMVs). This comparison is motivated by theoretical work in dependency grammar, in particular that of Tesnière (1959), in which AVCs and FMVs are both instances of a nucleus, the basic unit of syntax. An AVC is a dissociated nucleus consisting of at least two words; an FMV is its non-dissociated counterpart, consisting of exactly one word. We suggest that the representations of AVCs and FMVs should capture similar information. We use diagnostic classifiers to probe agreement and transitivity information in vectors learned by a transition-based neural parser in four typologically different languages. We find that the parser learns different information about AVCs and FMVs when only sequential models (BiLSTMs) are used in the architecture, but similar information when a recursive layer is added. We explain why this is the case by examining closely how information is learned in the network and what happens under different dependency representations of AVCs. We conclude that there may be benefits to using a recursive layer in dependency parsing, and that we have not yet found the best way to integrate it into our parsers.
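The diagnostic-classifier methodology mentioned above can be illustrated with a minimal sketch: a linear probe is trained to predict a binary linguistic property (here, a stand-in for transitivity) from fixed representation vectors, and its held-out accuracy indicates whether the property is linearly recoverable. Everything below is a hypothetical illustration, not the authors' implementation: the synthetic vectors, the `make_data` and `train_probe` helpers, and all hyperparameters are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=400, dim=16):
    # Synthetic stand-ins for parser state vectors: the binary label
    # (e.g. transitive vs. intransitive) is linearly encoded in the
    # first dimension plus noise, mimicking a recoverable property.
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, dim))
    X[:, 0] += 2.0 * (y - 0.5)
    return X, y

def train_probe(X, y, lr=0.1, epochs=200):
    # A plain logistic-regression probe trained with batch gradient
    # descent; deliberately simple so that high accuracy reflects
    # information present in the vectors, not probe capacity.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    # Fraction of held-out vectors whose property is predicted correctly.
    return float(np.mean(((X @ w + b) > 0) == y))

X, y = make_data()
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]
w, b = train_probe(X_tr, y_tr)
print(f"probe accuracy: {accuracy(w, b, X_te, y_te):.2f}")
```

In the actual study, the input vectors would come from the parser's BiLSTM or recursive layer rather than being sampled synthetically, and the comparison of probe accuracies on AVC versus FMV vectors is what reveals whether the two are represented similarly.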