Text sentiment analysis, also known as opinion mining, is the computational study of people's opinions, evaluations, attitudes, and emotions toward entities. It can be divided into document-level, sentence-level, and aspect-level sentiment analysis. Aspect-Based Sentiment Analysis (ABSA) is a fine-grained task in the field of sentiment analysis that aims to predict the sentiment polarity of specific aspects. Research on pre-trained neural models has significantly improved the performance of many natural language processing tasks, and in recent years pre-trained models (PTMs) have been applied to ABSA. This raises the question of whether PTMs contain sufficient syntactic information for ABSA. In this paper, we explore the recent DeBERTa model (Decoding-enhanced BERT with disentangled attention) for Aspect-Based Sentiment Analysis. DeBERTa is a Transformer-based neural language model pre-trained on large raw text corpora with self-supervised learning. Building on the Local Context Focus (LCF) mechanism and integrating the DeBERTa model, we propose a multi-task learning model for aspect-based sentiment analysis. Experimental results on the most widely used laptop and restaurant datasets of SemEval-2014 and the ACL Twitter dataset show that the LCF mechanism with DeBERTa achieves significant improvement.
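To make the LCF-with-DeBERTa idea concrete, the sketch below shows one common LCF variant, context dynamic masking (CDM), applied on top of a DeBERTa encoder via the HuggingFace `transformers` library. The checkpoint name `microsoft/deberta-base`, the SRD threshold of 3, the mean pooling, and the string-matching aspect localization are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal LCF-CDM sketch over a DeBERTa encoder (assumed setup, not the
# paper's exact model): hidden states of tokens whose Semantic Relative
# Distance (SRD) from the aspect span exceeds a threshold are zeroed out.
import torch
from transformers import DebertaTokenizer, DebertaModel

tokenizer = DebertaTokenizer.from_pretrained("microsoft/deberta-base")
encoder = DebertaModel.from_pretrained("microsoft/deberta-base")

def lcf_cdm_features(sentence: str, aspect: str, srd: int = 3) -> torch.Tensor:
    """Encode the sentence, mask far-context tokens (CDM), and pool."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**enc).last_hidden_state  # (1, seq_len, hidden)

    # Locate the aspect span by token-id matching; the leading space makes
    # the BPE pieces match mid-sentence tokens (a simplification).
    aspect_ids = tokenizer(" " + aspect, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    start = next(i for i in range(len(ids))
                 if ids[i:i + len(aspect_ids)] == aspect_ids)
    end = start + len(aspect_ids) - 1

    # SRD here is the token-index distance to the aspect span.
    positions = torch.arange(hidden.size(1))
    dist = torch.where(positions < start, start - positions,
                       torch.clamp(positions - end, min=0))
    mask = (dist <= srd).float().unsqueeze(0).unsqueeze(-1)  # (1, seq_len, 1)

    local = hidden * mask      # CDM: drop features outside the local context
    return local.mean(dim=1)   # pooled local-context representation

feats = lcf_cdm_features("The battery life is great but the screen is dim.",
                         "battery life")
print(feats.shape)  # torch.Size([1, 768])
```

In a full model, this pooled local-context vector would typically be fused with a global-context representation and fed to a polarity classifier; the paper's multi-task variant adds further heads on top of the shared encoder.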