Recent research on style transfer takes inspiration from unsupervised neural machine translation (UNMT), learning from large amounts of non-parallel data by exploiting cycle consistency loss, back-translation, and denoising autoencoders. By contrast, the use of self-supervised NMT (SSNMT), which leverages (near) parallel instances hidden in non-parallel data more efficiently than UNMT, has not yet been explored for style transfer. In this paper we present a novel Self-Supervised Style Transfer (3ST) model, which augments SSNMT with UNMT methods in order to identify and efficiently exploit supervisory signals in non-parallel social media posts. We compare 3ST with state-of-the-art (SOTA) style transfer models across civil rephrasing, formality and polarity tasks. We show that 3ST best balances the three major objectives (fluency, content preservation, attribute transfer accuracy), outperforming SOTA models on average across their tested tasks in both automatic and human evaluation.
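To make the denoising-autoencoder component concrete, the sketch below shows one common way to corrupt an input sentence before asking the model to reconstruct it (token dropout plus local shuffling, as popularized in UNMT). The function name, noise probabilities, and example sentence are illustrative assumptions, not details taken from the 3ST paper.

```python
import random

def add_noise(tokens, drop_prob=0.1, max_shuffle_dist=3, seed=0):
    """Corrupt a token sequence for denoising-autoencoder training:
    randomly drop tokens, then locally shuffle the survivors.
    (Illustrative sketch; exact noising in 3ST may differ.)"""
    rng = random.Random(seed)
    # Randomly drop tokens (always keep at least one).
    kept = [t for t in tokens if rng.random() > drop_prob] or tokens[:1]
    # Local shuffle: each token may move at most ~max_shuffle_dist positions.
    keys = [i + rng.uniform(0, max_shuffle_dist) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]

sent = "this movie was absolutely wonderful".split()
noisy = add_noise(sent)
# The autoencoder is then trained to reconstruct `sent` from `noisy`.
```

Training a decoder to undo this corruption teaches it to produce fluent output in the target style even when the encoder input is imperfect, which is what makes the noising step useful alongside back-translation.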