Eliciting knowledge from pre-trained language models via prompt-based learning has shown great potential in many natural language processing tasks. However, its application to more complex tasks such as event extraction is less studied, since prompt design is not straightforward for structured events containing various triggers and arguments. In this paper, we present a novel prompt-based approach that elicits both independent and joint knowledge about different events for event argument extraction. Experimental results on the benchmark ACE2005 dataset demonstrate the clear advantages of our approach. In particular, it outperforms recent state-of-the-art methods in both fully-supervised and low-resource scenarios.