We describe Sockeye (version 1.12), an open-source sequence-to-sequence toolkit for Neural Machine Translation (NMT). Sockeye is a production-ready framework for training and applying models as well as an experimental platform for researchers. Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks. Sockeye also supports a wide range of optimizers, normalization and regularization techniques, and inference improvements from current NMT literature. Users can easily run standard training recipes, explore different model settings, and incorporate new ideas. In this paper, we highlight Sockeye's features and benchmark it against other NMT toolkits on two language arcs from the 2017 Conference on Machine Translation (WMT): English-German and Latvian-English. We report competitive BLEU scores across all three architectures, including an overall best score for Sockeye's transformer implementation. To facilitate further comparison, we release all system outputs and training scripts used in our experiments. The Sockeye toolkit is free software released under the Apache 2.0 license.
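As an illustration of the command-line workflow mentioned above, the following is a minimal sketch of training and decoding with Sockeye's CLI; the data file names and output directory are placeholders, and exact flags may differ between Sockeye versions:

```
# Train a model on parallel data (placeholder file names).
python -m sockeye.train --source train.de \
                        --target train.en \
                        --validation-source dev.de \
                        --validation-target dev.en \
                        --output wmt_model

# Translate new input with the trained model; source sentences are read from stdin.
python -m sockeye.translate --models wmt_model < test.de > test.en
```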