The development of large and super-large language models, such as GPT-3, T5, Switch Transformer, and ERNIE, has significantly improved the performance of text generation. One important research direction in this area is the generation of argumentative texts. Solutions to this problem can be applied in business meetings, political debates, dialogue systems, and the preparation of student essays, and one of the main application domains is the economic sphere. The key obstacle to argumentative text generation for Russian is the lack of annotated argumentation corpora. In this paper, we use translated versions of the Argumentative Microtext, Persuasive Essays, and UKP Sentential corpora to fine-tune the RuBERT model. This model is then used to annotate a corpus of economic news with argumentation. The annotated corpus is in turn employed to fine-tune the ruGPT-3 model, which generates argumentative texts. The results show that this approach improves the accuracy of argument generation by more than 20 percentage points (63.2\% vs. 42.5\%) compared to the original ruGPT-3 model.
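To make the two-stage pipeline concrete, the sketch below outlines it with the HuggingFace Transformers library; the checkpoint names, the binary label set, and the generation prompt are illustrative assumptions and do not reproduce the authors' exact configuration.

```python
# Minimal sketch of the two-stage pipeline summarized above, assuming the
# HuggingFace Transformers library. Checkpoint names, label set, and prompt
# are illustrative assumptions, not the authors' exact setup.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    AutoModelForCausalLM,
)

# Stage 1: RuBERT fine-tuned (on the translated Argumentative Microtext,
# Persuasive Essays, and UKP Sentential corpora) as a sentence-level
# argument classifier; here only the base checkpoint is loaded.
clf_name = "DeepPavlov/rubert-base-cased"           # assumed RuBERT checkpoint
clf_tok = AutoTokenizer.from_pretrained(clf_name)
clf = AutoModelForSequenceClassification.from_pretrained(
    clf_name, num_labels=2                          # e.g. argument / non-argument
)

# The classifier is then applied to economic-news sentences to build an
# argument-annotated corpus (annotation loop omitted in this sketch).

# Stage 2: ruGPT-3 fine-tuned on the argument-annotated news corpus and used
# to generate argumentative continuations for a topic prompt.
gen_name = "ai-forever/rugpt3large_based_on_gpt2"   # assumed ruGPT-3 checkpoint
gen_tok = AutoTokenizer.from_pretrained(gen_name)
gen = AutoModelForCausalLM.from_pretrained(gen_name)

prompt = "Инфляция в России снизится, потому что"   # hypothetical prompt
ids = gen_tok(prompt, return_tensors="pt").input_ids
out = gen.generate(ids, max_new_tokens=60, do_sample=True, top_p=0.9)
print(gen_tok.decode(out[0], skip_special_tokens=True))
```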