In recent years, much effort has been devoted to applying neural models to the task of natural language generation. The challenge is to generate natural, human-like text and to control the generation process. This paper presents a task-agnostic survey of recent advances in neural text generation. These advances have been achieved through numerous developments, which we group under the following four headings: data construction, neural frameworks, training and inference strategies, and evaluation metrics. Finally, we discuss future directions for the development of neural text generation, including neural pipelines and exploiting background knowledge.