Non-autoregressive mechanisms can significantly decrease inference time for speech transformers, especially when the single-step variant is applied. Previous work on the CTC alignment-based single-step non-autoregressive transformer (CASS-NAT) has shown a large real-time factor (RTF) improvement over autoregressive transformers (AT). In this work, we propose several methods to improve the accuracy of the end-to-end CASS-NAT, followed by performance analyses. First, convolution-augmented self-attention blocks are applied to both the encoder and decoder modules. Second, we propose to expand the trigger mask (acoustic boundary) for each token to increase the robustness of CTC alignments. In addition, iterated loss functions are used to enhance the gradient updates of low-layer parameters. Without an external language model, the improved CASS-NAT using the three methods achieves WERs of 3.1%/7.2% on the Librispeech test-clean/test-other sets and a CER of 5.4% on the Aishell1 test set, a 7%–21% relative WER/CER improvement. For the analyses, we plot attention weight distributions in the decoders to visualize the relationships between token-level acoustic embeddings; the visualization shows that the acoustic embeddings behave similarly to word embeddings, which explains why the improved CASS-NAT performs similarly to the AT.
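To make the trigger-mask expansion concrete, below is a minimal NumPy sketch of one plausible variant: per-token acoustic boundaries are derived from a frame-level CTC alignment, and each boundary is then widened by a few frames so that the token-level acoustic embedding is less sensitive to small alignment errors. The function names and the `expand` parameter are illustrative assumptions, not taken from the paper's implementation.

```python
# Hedged sketch: deriving per-token trigger masks from a CTC alignment and
# expanding each token's acoustic boundary. Illustrative only; not the
# paper's actual code.
import numpy as np

def token_boundaries_from_alignment(alignment, blank=0):
    """Return (start, end) frame indices for each non-blank token in a
    frame-level CTC alignment (adjacent repeated labels are collapsed)."""
    boundaries, prev = [], blank
    for t, lab in enumerate(alignment):
        if lab != blank and lab != prev:
            boundaries.append([t, t])      # a new token starts at frame t
        elif lab != blank and lab == prev:
            boundaries[-1][1] = t          # the current token extends to frame t
        prev = lab
    return boundaries

def trigger_masks(boundaries, n_frames, expand=2):
    """One boolean mask per token; True marks encoder frames the token may
    attend to. `expand` widens the acoustic boundary on both sides."""
    masks = np.zeros((len(boundaries), n_frames), dtype=bool)
    for i, (s, e) in enumerate(boundaries):
        masks[i, max(0, s - expand):min(n_frames, e + 1 + expand)] = True
    return masks

# Toy example: a 10-frame alignment with blank = 0.
align = [0, 1, 1, 0, 2, 0, 0, 3, 3, 0]
bounds = token_boundaries_from_alignment(align)
print(bounds)                                        # [[1, 2], [4, 4], [7, 8]]
print(trigger_masks(bounds, len(align), expand=1).astype(int))
```

With `expand=0` the mask reduces to the raw CTC boundaries; a small positive value gives each token a margin of neighboring frames, which is the robustness to imperfect CTC alignments that the abstract refers to.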


