Using task-specific pre-training and leveraging cross-lingual transfer are two of the most popular ways to handle code-switched data. In this paper, we aim to compare the effects of both for the task of sentiment analysis. We work with two Dravidian code-switched language pairs - Tamil-English and Malayalam-English - and four different BERT-based models. We compare the effects of task-specific pre-training and cross-lingual transfer and find that task-specific pre-training yields superior zero-shot and supervised performance compared to cross-lingual transfer from multilingual BERT models.