Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.