In Natural Language Processing (NLP), we often need to extract information from tree-structured data: sentence structure can be represented as a dependency tree or a constituency tree. For this reason, a variant of the LSTM, called the Tree-LSTM, was proposed to operate over tree topologies. In this paper, we design a generalized attention framework for both dependency and constituency trees by encoding variants of decomposable attention inside the Tree-LSTM cell. We evaluated our models on a semantic relatedness task and achieved notable results compared to attention-free Tree-LSTM methods as well as other neural and non-neural methods, and good results compared to Tree-LSTM methods that use attention.
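To make the idea of placing attention inside a Tree-LSTM cell concrete, the sketch below shows a Child-Sum Tree-LSTM cell in which the plain sum over child hidden states is replaced by an attention-weighted combination. This is a minimal illustration, assuming a simple additive scoring function; the class name, dimensions, and the exact attention variant are illustrative and do not reproduce the paper's decomposable-attention formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveChildSumTreeLSTMCell(nn.Module):
    """Child-Sum Tree-LSTM cell where the sum of child hidden states is
    replaced by an attention-weighted combination (illustrative sketch)."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        # Input (i), output (o), and update (u) gates share one projection
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # Forget gate is computed separately for each child
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Additive attention scorer over child hidden states (assumed form)
        self.attn = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,); child_h, child_c: (num_children, hidden_dim)
        if child_h.size(0) == 0:
            # Leaf node: no children to attend over
            h_tilde = x.new_zeros(self.hidden_dim)
            c_children = x.new_zeros(self.hidden_dim)
        else:
            # Attention weights over children replace the plain Child-Sum
            scores = self.attn(torch.tanh(child_h)).squeeze(-1)   # (num_children,)
            alpha = F.softmax(scores, dim=0)
            h_tilde = (alpha.unsqueeze(-1) * child_h).sum(dim=0)  # (hidden_dim,)
            # Per-child forget gates, as in the standard Child-Sum cell
            f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))
            c_children = (f * child_c).sum(dim=0)

        iou = self.W_iou(x) + self.U_iou(h_tilde)
        i, o, u = torch.chunk(iou, 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        c = i * u + c_children
        h = o * torch.tanh(c)
        return h, c

# Example usage with hypothetical dimensions:
cell = AttentiveChildSumTreeLSTMCell(input_dim=300, hidden_dim=150)
x = torch.randn(300)                 # word embedding at the current node
child_h = torch.randn(2, 150)        # hidden states of two children
child_c = torch.randn(2, 150)        # cell states of two children
h, c = cell(x, child_h, child_c)
```

The only change relative to a plain Child-Sum Tree-LSTM is that `h_tilde` is a soft attention-weighted mixture of the child hidden states rather than their unweighted sum, which lets each node focus on its most relevant children.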