Non-autoregressive (NAR) transformer models have been studied intensively in automatic speech recognition (ASR), and a substantial number of NAR transformer models rely on the causal mask to limit token dependencies. However, the causal mask is designed for the left-to-right decoding process of the non-parallel autoregressive (AR) transformer, which is inappropriate for the parallel NAR transformer since it ignores right-to-left contexts. Some models have been proposed that exploit right-to-left contexts with an extra decoder, but these methods increase model complexity. To tackle the above problems, we propose a new non-autoregressive transformer with a unified bidirectional decoder (NAT-UBD), which can simultaneously utilize left-to-right and right-to-left contexts. However, direct use of bidirectional contexts causes information leakage, meaning that the decoder output at a given position can be affected by the character information from the input at the same position. To avoid information leakage, we propose a novel attention mask and modify the vanilla query, key, and value matrices for NAT-UBD. Experimental results verify that NAT-UBD can achieve character error rates (CERs) of 5.0%/5.5% on the Aishell1 dev/test sets, outperforming all previous NAR transformer models. Moreover, NAT-UBD can run 49.8x faster than the AR transformer baseline when decoding in a single step.
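As a rough illustration of the masking difference described above, and not the paper's exact implementation (which also modifies the query, key, and value matrices), the sketch below contrasts a standard left-to-right causal mask with a bidirectional mask whose diagonal is blocked, so that the output at position i cannot see the input character at the same position; the function names and shapes are illustrative assumptions.

```python
import torch

def causal_mask(n: int) -> torch.Tensor:
    # Standard left-to-right causal mask used by AR decoders:
    # position i may attend only to positions 0..i (True = allowed).
    return torch.tril(torch.ones(n, n, dtype=torch.bool))

def bidirectional_no_leak_mask(n: int) -> torch.Tensor:
    # Illustrative bidirectional mask in the spirit of NAT-UBD:
    # position i may attend to every other position, but the diagonal
    # is blocked so the output at i cannot be influenced by the input
    # character at the same position (the "information leakage" above).
    mask = torch.ones(n, n, dtype=torch.bool)
    mask.fill_diagonal_(False)
    return mask

if __name__ == "__main__":
    print(causal_mask(4))
    print(bidirectional_no_leak_mask(4))
```

In an attention layer, positions where the mask is False would be set to negative infinity before the softmax, so both left and right contexts contribute to every output except the token at the same position.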