In aspect-level sentiment classification (ASC), state-of-the-art models encode either a syntax graph or a relation graph to capture local syntactic information or global relational information. Despite their advantages, syntax and relation graphs each have shortcomings that are often neglected, limiting representation power in the graph modeling process. To overcome these limitations, we design a novel local-global interactive graph, which marries their advantages by stitching the two graphs together via interactive edges. To model this local-global interactive graph, we propose a novel neural network termed DigNet, whose core module is a stack of local-global interactive (LGI) layers, each performing two processes: intra-graph message passing and cross-graph message passing. In this way, local syntactic and global relational information are reconciled as a whole when inferring aspect-level sentiment. Concretely, we design two variants of the local-global interactive graph with different kinds of interactive edges, as well as three variants of the LGI layer. We conduct experiments on several public benchmark datasets, and the results show that DigNet outperforms the previous best scores by 3\%, 2.32\%, and 6.33\% in terms of Macro-F1 on the Lap14, Res14, and Res15 datasets, respectively, confirming the effectiveness and superiority of the proposed local-global interactive graph and DigNet.
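To make the two message-passing processes concrete, the following is a minimal, hypothetical PyTorch sketch of one LGI layer. It is not the authors' implementation: the GCN-style update, the cross-graph adjacency `a_cross` encoding the interactive edges, and all module and tensor names are assumptions introduced purely for illustration.

```python
# Hypothetical sketch of one local-global interactive (LGI) layer.
# Assumptions (not from the paper): GCN-style row-normalized aggregation,
# residual combination of intra- and cross-graph messages, and the names below.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LGILayer(nn.Module):
    """One LGI layer over a syntax (local) graph and a relation (global) graph."""

    def __init__(self, dim: int):
        super().__init__()
        self.w_syn = nn.Linear(dim, dim)  # intra-graph transform, syntax graph
        self.w_rel = nn.Linear(dim, dim)  # intra-graph transform, relation graph
        self.w_s2r = nn.Linear(dim, dim)  # cross-graph transform, syntax -> relation
        self.w_r2s = nn.Linear(dim, dim)  # cross-graph transform, relation -> syntax

    @staticmethod
    def _propagate(adj: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Row-normalized neighborhood aggregation over the given adjacency.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return (adj @ h) / deg

    def forward(self, h_syn, h_rel, a_syn, a_rel, a_cross):
        # h_syn, h_rel: [batch, n, dim] node states of the two graphs
        # a_syn, a_rel: [batch, n, n] intra-graph adjacency matrices
        # a_cross:      [batch, n, n] interactive edges stitching the two graphs
        # 1) intra-graph message passing within each graph
        h_syn_intra = F.relu(self.w_syn(self._propagate(a_syn, h_syn)))
        h_rel_intra = F.relu(self.w_rel(self._propagate(a_rel, h_rel)))
        # 2) cross-graph message passing along the interactive edges
        h_syn_out = h_syn_intra + F.relu(
            self.w_r2s(self._propagate(a_cross, h_rel_intra)))
        h_rel_out = h_rel_intra + F.relu(
            self.w_s2r(self._propagate(a_cross.transpose(1, 2), h_syn_intra)))
        return h_syn_out, h_rel_out
```

Stacking several such layers lets each node's state alternate between consulting its own graph's neighborhood and the counterpart graph, which is the intended effect of reconciling local syntactic and global relational information.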