Abstract Meaning Representation (AMR) parsing aims to translate sentences into semantic representations with a hierarchical structure, and has recently been empowered by pretrained sequence-to-sequence models. However, there is a gap between their flat training objective (i.e., treating all output tokens equally) and the hierarchical AMR structure, which limits model generalization. To bridge this gap, we propose a Hierarchical Curriculum Learning (HCL) framework with a Structure-level Curriculum (SC) and an Instance-level Curriculum (IC). SC progressively shifts from core to detailed AMR semantic elements, while IC transitions from structure-simple to structure-complex AMR instances during training. Through these two warm-up processes, HCL reduces the difficulty of learning complex structures, so that the flat model can better adapt to the AMR hierarchy. Extensive experiments on AMR2.0 and AMR3.0, as well as in structure-complex and out-of-distribution settings, verify the effectiveness of HCL.
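Concretely, the instance-level curriculum can be pictured as ranking training pairs by a structural-complexity score and letting the model see progressively harder instances as training proceeds. The sketch below is a minimal illustration, not the paper's implementation: the depth-based complexity measure (`amr_depth`) and the linear pacing schedule are assumptions introduced here for clarity.

```python
# Hypothetical sketch of an instance-level curriculum (IC): instances are
# sorted by a structural-complexity score, and the visible portion of the
# data grows linearly over epochs. The complexity measure (bracket nesting
# depth of the linearized AMR) and the pacing function are illustrative
# assumptions, not the paper's exact definitions.

import random
from typing import List, Tuple

def amr_depth(amr: str) -> int:
    """Approximate structural complexity by maximum bracket nesting depth
    of a linearized (PENMAN-style) AMR graph."""
    depth = max_depth = 0
    for ch in amr:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
    return max_depth

def curriculum_batches(data: List[Tuple[str, str]], epoch: int,
                       total_epochs: int, batch_size: int):
    """Yield batches drawn from the structure-simplest fraction of the
    (sentence, AMR) pairs, expanding until the full dataset is visible."""
    ranked = sorted(data, key=lambda pair: amr_depth(pair[1]))
    visible = max(batch_size, int(len(ranked) * (epoch + 1) / total_epochs))
    pool = ranked[:visible]
    random.shuffle(pool)  # shuffle within the currently visible (easy) pool
    for i in range(0, len(pool), batch_size):
        yield pool[i:i + batch_size]
```

Under such a schedule, early epochs draw batches only from shallow, structure-simple graphs, and deeper instances are mixed in gradually until the whole dataset participates.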