Recently, attention-based encoder-decoder (AED) models have shown state-of-the-art performance in automatic speech recognition (ASR). Since the original AED models with global attention are not capable of online inference, various online attention schemes have been developed to reduce ASR latency and improve the user experience. However, a common limitation of conventional softmax-based online attention approaches is that they introduce an additional hyperparameter related to the length of the attention window, requiring multiple training runs to tune it. To address this problem, we propose a novel softmax-free attention method and its modified formulation for online attention, which do not require any additional hyperparameter at the training phase. Through a number of ASR experiments, we demonstrate that the tradeoff between the latency and the performance of the proposed online attention technique can be controlled by merely adjusting a threshold at the test phase. Furthermore, the proposed methods show word error rates (WERs) competitive with those of conventional global and online attention methods.
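The abstract does not give the formulation itself, so the following is only a minimal illustrative sketch of one way a softmax-free attention weighting with a test-time truncation threshold could look. The sigmoid-based normalization, the `threshold` cutoff rule, and the function name are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def softmax_free_attention(energies, threshold=None):
    """Illustrative softmax-free attention (hypothetical formulation).

    Attention energies are passed through a sigmoid and normalized by their
    sum rather than a softmax. If `threshold` is given (test time only), the
    attention window is truncated once the cumulative weight mass reaches
    1 - threshold, which bounds how many encoder frames the decoder must
    wait for and thus trades latency against accuracy without retraining.
    """
    p = 1.0 / (1.0 + np.exp(-np.asarray(energies, dtype=np.float64)))  # sigmoid activation
    w = p / (p.sum() + 1e-8)                                           # normalize without softmax
    if threshold is not None:
        cum = np.cumsum(w)
        cutoff = int(np.searchsorted(cum, 1.0 - threshold)) + 1        # frames kept in the window
        w = w[:cutoff]
        w = w / (w.sum() + 1e-8)                                       # renormalize truncated window
    return w

# Example: a larger threshold truncates the window earlier (lower latency).
energies = np.array([2.0, 1.5, 0.3, -1.0, -2.5])
print(softmax_free_attention(energies))                 # full (offline) weights
print(softmax_free_attention(energies, threshold=0.1))  # truncated (online) weights
```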