Recursive Neural Networks (RvNNs), which compose sequences according to their underlying hierarchical syntactic structure, have performed well on several natural language processing tasks compared to similar models without structural biases. However, traditional RvNNs are incapable of inducing the latent structure of a plain text sequence on their own. Several extensions have been proposed to overcome this limitation; nevertheless, these extensions tend to rely on surrogate gradients or reinforcement learning, at the cost of higher bias or variance. In this work, we propose the Continuous Recursive Neural Network (CRvNN), a backpropagation-friendly alternative that addresses the aforementioned limitations by incorporating a continuous relaxation into the induced structure. We demonstrate that CRvNN achieves strong performance on challenging synthetic tasks such as logical inference and ListOps. We also show that CRvNN performs comparably to or better than prior latent structure models on real-world tasks such as sentiment analysis and natural language inference.
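To make the contrast with surrogate-gradient and reinforcement-learning approaches concrete, below is a minimal sketch of what a continuous relaxation of structure induction can look like: instead of a hard, non-differentiable argmax over which adjacent pair of nodes to compose, every pair is composed and softly weighted, so ordinary backpropagation applies end-to-end. The module and function names (`SoftTreeStep`, `score`, `compose`) and the exact update rule are illustrative assumptions, not the CRvNN architecture itself.

```python
# Hypothetical sketch of a continuous relaxation of one tree-reduction step.
# This is NOT the CRvNN method from the paper; it only illustrates why a
# soft structural decision is backpropagation-friendly.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftTreeStep(nn.Module):
    """One soft reduction step over a sequence of node embeddings.

    A discrete RvNN would pick exactly one adjacent pair (via argmax or a
    sampled action) to compose, which blocks gradient flow. Here, every
    adjacent pair is composed and weighted by a softmax over learned
    scores, so the whole step stays differentiable."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)      # scores each adjacent pair
        self.compose = nn.Linear(2 * dim, dim)  # composition function

    def forward(self, nodes: torch.Tensor) -> torch.Tensor:
        # nodes: (seq_len, dim); adjacent pairs: (seq_len - 1, 2 * dim)
        pairs = torch.cat([nodes[:-1], nodes[1:]], dim=-1)
        # Soft "which pair merges" decision instead of a hard argmax.
        weights = F.softmax(self.score(pairs).squeeze(-1), dim=0)
        merged = torch.tanh(self.compose(pairs))  # candidate parent nodes
        # Soft update: each position is a mixture of "kept" and "merged",
        # rather than a hard delete-and-replace as in a discrete parser.
        w = weights.unsqueeze(-1)
        left = nodes[:-1] * (1 - w) + merged * w
        return torch.cat([left, nodes[-1:]], dim=0)

# Usage: gradients flow through the (soft) structural decision itself,
# with no surrogate gradients or REINFORCE needed.
step = SoftTreeStep(dim=8)
x = torch.randn(5, 8, requires_grad=True)
step(x).sum().backward()
print(x.grad.shape)  # torch.Size([5, 8])
```

The design point this illustrates is the bias-variance trade-off mentioned above: straight-through estimators bias the gradient and REINFORCE-style training has high variance, whereas a continuous relaxation yields exact gradients of a smoothed objective.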