Recent advances in prompt-based learning have shown impressive results on few-shot text classification tasks using cloze-style language prompts. There have been attempts to apply prompt-based learning to NER using manually designed templates to predict entity types. However, these two-step methods may suffer from error propagation (from entity span detection), need to prompt for every possible text span, which is costly, and neglect the interdependency between labels when predicting labels for different spans in a sentence. In this paper, we present a simple demonstration-based learning method for NER, which augments the prompt (learning context) with a few task demonstrations. Such demonstrations help the model learn the task better under low-resource settings and allow span detection and classification over all tokens to be performed jointly. Here, we explore entity-oriented demonstration, which selects an appropriate entity example for each entity type, and instance-oriented demonstration, which retrieves a similar instance example. Through extensive experiments, we find empirically that showing an entity example for each entity type, along with its example sentence, improves performance in both in-domain and cross-domain settings by 1-3 F1 points.
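The entity-oriented demonstration described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the template string, the `[SEP]` separator, the helper names, and the example entities are all assumptions for illustration; the actual template and demonstration selection strategy may differ.

```python
# Sketch of entity-oriented demonstration construction for NER.
# Hypothetical helper names and template; the paper's exact format may differ.

SEP = " [SEP] "  # separator token; assumption, depends on the tagger's tokenizer


def build_demonstration(example_per_type):
    """Build one demonstration string from a mapping of
    {entity_type: (example_entity, example_sentence)} -- one example per type."""
    parts = []
    for etype, (entity, sentence) in example_per_type.items():
        # Illustrative template: show the example sentence, then state the
        # entity's type ("<sentence> <entity> is <type>.").
        parts.append(f"{sentence} {entity} is {etype}.")
    return SEP.join(parts)


def augment_input(input_sentence, example_per_type):
    # The demonstration is appended after the original input, so the model
    # can still detect and classify spans over all input tokens jointly.
    return input_sentence + SEP + build_demonstration(example_per_type)


# Hypothetical examples: one entity and example sentence per entity type.
examples = {
    "PER": ("Obama", "Obama visited Berlin in 2013."),
    "LOC": ("Berlin", "Obama visited Berlin in 2013."),
}
augmented = augment_input("Merkel met reporters in Paris.", examples)
print(augmented)
```

The augmented string would then be fed to a standard token-level tagger, which labels only the tokens of the original input sentence while attending to the appended demonstrations.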