In recent years, a number of keyphrase generation (KPG) approaches have been proposed, featuring complex model architectures, dedicated training paradigms, and decoding strategies. In this work, we opt for simplicity and show how a commonly used seq2seq language model, BART, can be easily adapted to generate keyphrases from text in a single batch computation using a simple training procedure. Empirical results on five benchmarks show that our approach performs on par with existing state-of-the-art KPG systems while using a much simpler and easier-to-deploy framework.
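As a rough illustration of the kind of setup described above, the sketch below shows one fine-tuning step and batched keyphrase decoding with a BART checkpoint via HuggingFace Transformers. The checkpoint name, the "; " separator used to concatenate target keyphrases, and the example texts are assumptions for illustration only, not necessarily the exact configuration used in the paper.

```python
# Minimal sketch: one seq2seq fine-tuning step and batched inference with
# BART for keyphrase generation. Checkpoint, separator format, and data
# are illustrative assumptions.
import torch
from transformers import BartForConditionalGeneration, BartTokenizerFast

tokenizer = BartTokenizerFast.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# --- One fine-tuning step: source document -> concatenated keyphrases ---
docs = ["We study keyphrase generation with sequence-to-sequence models ..."]
targets = ["keyphrase generation; sequence-to-sequence models; BART"]

inputs = tokenizer(docs, return_tensors="pt", padding=True, truncation=True)
labels = tokenizer(text_target=targets, return_tensors="pt", padding=True).input_ids
# (In practice, padding positions in `labels` are set to -100 so they are
#  ignored by the loss.)
loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()                             # optimizer step omitted for brevity

# --- Batched inference: generate keyphrase sequences for the whole batch ---
model.eval()
with torch.no_grad():
    generated = model.generate(**inputs, num_beams=4, max_length=64)
keyphrases = [out.split("; ")
              for out in tokenizer.batch_decode(generated, skip_special_tokens=True)]
print(keyphrases)
```

The main point of the sketch is that no dedicated architecture or decoding strategy is required: the model is trained with the ordinary seq2seq objective, and all keyphrases for a document are produced in a single `generate` call.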