Aspect-based sentiment analysis (ABSA) typically requires in-domain annotated data for supervised training or fine-tuning, which makes scaling ABSA to a large number of new domains a significant challenge. This paper aims to train a unified model that can perform zero-shot ABSA without using any annotated data for a new domain. We propose a method called contrastive post-training on review Natural Language Inference (CORN). Downstream ABSA tasks can then be cast as NLI problems for zero-shot transfer. We evaluate CORN on ABSA tasks ranging from aspect extraction (AE) and aspect sentiment classification (ASC) to end-to-end aspect-based sentiment analysis (E2E ABSA), showing that ABSA can be conducted without any human-annotated ABSA data.
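To make the casting concrete, the sketch below shows how a single ASC instance can be framed as NLI with an off-the-shelf MNLI-fine-tuned model; this is an illustration of the general idea only, not the paper's CORN post-training. The model name, review text, labels, and hypothesis template are all illustrative assumptions.

```python
# A minimal sketch, assuming an off-the-shelf NLI model (roberta-large-mnli)
# stands in for a CORN-style model; everything here is illustrative.
from transformers import pipeline

nli = pipeline("zero-shot-classification", model="roberta-large-mnli")

review = "The battery life is amazing, but the screen scratches easily."
aspect = "battery life"

# Cast ASC as NLI: the review is the premise, and each candidate polarity
# fills a hypothesis about the aspect. The polarity whose hypothesis is
# most strongly entailed by the review is the predicted sentiment.
result = nli(
    review,
    candidate_labels=["positive", "negative", "neutral"],
    hypothesis_template=f"The {aspect} is {{}}.",
)
print(result["labels"][0])  # expected: "positive"
```

Because the sentiment decision is expressed entirely through entailment between a review premise and a label hypothesis, no annotated ABSA data from the target domain is needed at inference time.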