Aspect-based sentiment analysis (ABSA) aims to extract opinionated aspect terms from review texts and determine their sentiment polarities, and has been widely studied in both academia and industry. Because ABSA is a fine-grained classification task, its annotation cost is extremely high. Domain adaptation is a popular solution that alleviates the data deficiency issue in new domains by transferring common knowledge across domains. Most cross-domain ABSA studies are based on structural correspondence learning (SCL) and use pivot features to construct auxiliary tasks that narrow the gap between domains. However, their pivot-based auxiliary tasks can only transfer knowledge of aspect terms, not of sentiment, which limits the performance of existing models. In this work, we propose a novel Syntax-guided Domain Adaptation Model, named SDAM, for more effective cross-domain ABSA. SDAM exploits syntactic structure similarities to build pseudo training instances, in which aspect terms of the target domain are explicitly related to sentiment polarities. In addition, we propose a syntax-based BERT masked language model for further capturing domain-invariant features. Finally, to alleviate the sentiment inconsistency issue in multi-gram aspect terms, we introduce a span-based joint aspect term and sentiment analysis module into the cross-domain End2End ABSA framework. Experiments on five benchmark datasets show that our model consistently outperforms the state-of-the-art baselines in terms of the Micro-F1 metric for the cross-domain End2End ABSA task.
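To make the syntax-based masking idea concrete, below is a minimal, hypothetical sketch (not the authors' released code) assuming spaCy for dependency parsing and HuggingFace Transformers for the BERT masked language model. The `amod` heuristic, which masks candidate aspect terms (nouns with adjectival modifiers) and their opinion modifiers, is an illustrative stand-in for the paper's syntax-guided masking strategy.

```python
# Sketch: mask tokens occupying aspect/opinion-like syntactic slots so the
# masked-LM objective focuses on domain-invariant syntactic positions
# rather than random tokens. Heuristic and model choices are assumptions.
import spacy
import torch
from transformers import BertTokenizer, BertForMaskedLM

nlp = spacy.load("en_core_web_sm")
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

def syntax_guided_mask(sentence: str) -> str:
    """Replace words in amod slots (and their noun heads) with [MASK]."""
    doc = nlp(sentence)
    words = []
    for tok in doc:
        # Candidate opinion word (adjectival modifier) or candidate
        # aspect term (a noun that has an amod child) gets masked.
        if tok.dep_ == "amod" or any(c.dep_ == "amod" for c in tok.children):
            words.append(tokenizer.mask_token)
        else:
            words.append(tok.text)
    return " ".join(words)

masked = syntax_guided_mask("This laptop has a great battery.")
inputs = tokenizer(masked, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Print the model's top prediction at each [MASK] position.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
for pos in mask_positions:
    predicted_id = int(logits[0, pos].argmax())
    print(tokenizer.convert_ids_to_tokens(predicted_id))
```

In a domain-adaptation setting, this objective would be run over unlabeled target-domain reviews, so the model learns to fill aspect/opinion slots with target-domain vocabulary while the syntactic structure stays shared across domains.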