Sentence Ordering refers to the task of rearranging a set of sentences into a coherent order. For this task, most previous approaches have explored global context-based end-to-end methods using Sequence Generation techniques. In this paper, we put forward a set of robust local and global context-based pairwise ordering strategies; leveraging these, our prediction strategies outperform all previous works in this domain. Our proposed encoding method utilizes the paragraph's rich global contextual information to predict the pairwise order using novel transformer architectures. Analysis of the two proposed decoding strategies helps better explain error propagation in pairwise models. This approach is the most accurate pure pairwise model, and our encoding strategy also significantly improves the performance of other recent approaches that use pairwise models, including the previous state-of-the-art, demonstrating the research novelty and generalizability of this work. Additionally, we show how the pre-training task for ALBERT helps it to significantly outperform BERT, despite having considerably fewer parameters. The extensive experimental results, architectural analysis and ablation studies demonstrate the effectiveness and superiority of the proposed models compared to the previous state-of-the-art, while also providing a much better understanding of how pairwise models function.
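To make the pairwise formulation concrete, the following is a minimal sketch of global context-based pairwise ordering, assuming an ALBERT encoder from the `transformers` library. The classifier head, the use of the shuffled paragraph as the second input segment, and the simple "win-count" decoding are illustrative assumptions for exposition, not the exact architecture or decoding strategies proposed in the paper.

```python
# Illustrative sketch only: an untrained pairwise scorer plus a naive decoder.
from itertools import permutations

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
encoder = AutoModel.from_pretrained("albert-base-v2")
# Hypothetical classification head: scores whether sentence A precedes sentence B.
classifier = torch.nn.Linear(encoder.config.hidden_size, 1)


def pairwise_score(sent_a: str, sent_b: str, paragraph: str) -> float:
    """Score the hypothesis that sent_a precedes sent_b, conditioning on the
    shuffled paragraph as global context (one possible encoding scheme)."""
    inputs = tokenizer(sent_a + " " + sent_b, paragraph,
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = encoder(**inputs).last_hidden_state[:, 0]  # [CLS] representation
    return torch.sigmoid(classifier(cls)).item()


def order_by_wins(sentences: list[str]) -> list[str]:
    """Naive decoding: rank each sentence by the total score of the pairwise
    comparisons it 'wins'; error propagation across pairs is ignored here."""
    context = " ".join(sentences)
    wins = {s: 0.0 for s in sentences}
    for a, b in permutations(sentences, 2):
        wins[a] += pairwise_score(a, b, context)
    return sorted(sentences, key=wins.get, reverse=True)


print(order_by_wins(["He sat down.", "John entered the room.", "Then he spoke."]))
```

With a trained head, the same scorer can feed more careful decoding strategies; the abstract's analysis of two such strategies concerns how errors in individual pairwise predictions propagate into the final ordering.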