The University of Cambridge submission to the WMT18 news translation task focuses on the combination of diverse models of translation. We compare recurrent, convolutional, and self-attention-based neural models on German-English, English-German, and Chinese-English. Our final system combines all neural models together with a phrase-based SMT system in an MBR-based scheme. We report small but consistent gains on top of strong Transformer ensembles.
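The MBR-based combination mentioned above can be illustrated with a minimal sketch: given candidate translations and their (unnormalized) model scores, MBR selects the candidate with the highest expected utility under the model distribution. The function names (`mbr_select`, `ngram_overlap`) and the simple n-gram overlap utility are illustrative stand-ins, not the actual utility (e.g. sentence-level BLEU) or combination scheme used in the submission.

```python
from collections import Counter

def ngram_overlap(hyp, ref, n=2):
    # Toy utility: fraction of hyp n-grams also found in ref.
    # A stand-in for a proper sentence-level BLEU utility.
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    h, r = ngrams(hyp.split()), ngrams(ref.split())
    if not h:
        return 0.0
    return sum((h & r).values()) / sum(h.values())

def mbr_select(candidates, scores):
    # Minimum Bayes Risk selection: pick the candidate maximizing
    # expected utility under the normalized score distribution.
    z = sum(scores)
    best, best_score = None, float("-inf")
    for hyp in candidates:
        expected = sum((s / z) * ngram_overlap(hyp, other)
                       for other, s in zip(candidates, scores))
        if expected > best_score:
            best, best_score = hyp, expected
    return best
```

In a multi-system setting, `candidates` would pool hypotheses from the different models (recurrent, convolutional, self-attentional, and SMT), so MBR rewards translations on which the diverse systems agree.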