The recently proposed Conformer model has become the de facto backbone for various downstream speech tasks, owing to its hybrid attention-convolution architecture that captures both local and global features. However, through a series of systematic studies, we find that the Conformer architecture's design choices are not optimal. After re-examining the design choices for both the macro- and micro-architecture of Conformer, we propose Squeezeformer, which consistently outperforms state-of-the-art ASR models under the same training schemes. In particular, for the macro-architecture, Squeezeformer incorporates (i) the Temporal U-Net structure, which reduces the cost of the multi-head attention modules on long sequences, and (ii) a simpler block structure in which a multi-head attention or convolution module is followed by a feed-forward module, instead of the Macaron structure proposed in Conformer. Furthermore, for the micro-architecture, Squeezeformer (i) simplifies the activations in the convolutional block, (ii) removes redundant Layer Normalization operations, and (iii) incorporates an efficient depthwise down-sampling layer to sub-sample the input signal. Squeezeformer achieves state-of-the-art word-error-rates (WER) of 7.5%, 6.5%, and 6.0% on LibriSpeech test-other without external language models, which are 3.1%, 1.4%, and 0.6% better than Conformer-CTC with the same number of FLOPs. Our code is open-sourced and available online.
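The block layout and temporal down-sampling described above can be illustrated with a minimal PyTorch sketch. This is not the released Squeezeformer implementation: the module names (SqueezeformerBlock, TemporalDownsample), hyper-parameters, kernel sizes, and the use of Swish activations are illustrative assumptions, and details such as relative positional encoding and the Temporal U-Net's up-sampling skip connection are omitted.

```python
# Minimal sketch of a Squeezeformer-style block (MHA -> FFN -> Conv -> FFN,
# instead of Conformer's Macaron FFN -> MHA -> Conv -> FFN) and a depthwise
# strided convolution for temporal down-sampling. Illustrative only.
import torch
import torch.nn as nn


class FeedForward(nn.Module):
    def __init__(self, dim, expansion=4, dropout=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.LayerNorm(dim),
            nn.Linear(dim, dim * expansion),
            nn.SiLU(),                              # Swish activation
            nn.Dropout(dropout),
            nn.Linear(dim * expansion, dim),
            nn.Dropout(dropout),
        )

    def forward(self, x):
        return x + self.net(x)                      # residual connection


class ConvModule(nn.Module):
    def __init__(self, dim, kernel_size=31, dropout=0.1):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.pointwise_in = nn.Conv1d(dim, dim, kernel_size=1)
        self.depthwise = nn.Conv1d(dim, dim, kernel_size,
                                   padding=kernel_size // 2, groups=dim)
        self.bn = nn.BatchNorm1d(dim)
        self.act = nn.SiLU()                        # single Swish, no GLU gating
        self.pointwise_out = nn.Conv1d(dim, dim, kernel_size=1)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):                           # x: (batch, time, dim)
        y = self.norm(x).transpose(1, 2)            # -> (batch, dim, time)
        y = self.pointwise_in(y)
        y = self.act(self.bn(self.depthwise(y)))
        y = self.dropout(self.pointwise_out(y)).transpose(1, 2)
        return x + y


class AttentionModule(nn.Module):
    def __init__(self, dim, num_heads=4, dropout=0.1):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.mha = nn.MultiheadAttention(dim, num_heads,
                                         dropout=dropout, batch_first=True)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        y = self.norm(x)
        y, _ = self.mha(y, y, y, need_weights=False)
        return x + self.dropout(y)


class SqueezeformerBlock(nn.Module):
    """One block: attention module, FFN, convolution module, FFN."""
    def __init__(self, dim):
        super().__init__()
        self.layers = nn.Sequential(
            AttentionModule(dim), FeedForward(dim),
            ConvModule(dim), FeedForward(dim),
        )

    def forward(self, x):
        return self.layers(x)


class TemporalDownsample(nn.Module):
    """Depthwise strided conv that halves the sequence length (Temporal U-Net)."""
    def __init__(self, dim):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size=3, stride=2,
                              padding=1, groups=dim)

    def forward(self, x):                           # (batch, T, dim) -> (batch, T/2, dim)
        return self.conv(x.transpose(1, 2)).transpose(1, 2)


if __name__ == "__main__":
    x = torch.randn(2, 100, 144)                    # (batch, frames, features)
    x = SqueezeformerBlock(144)(x)
    x = TemporalDownsample(144)(x)
    print(x.shape)                                  # torch.Size([2, 50, 144])
```

Halving the sequence length with the depthwise down-sampling layer reduces the quadratic cost of the subsequent attention modules by roughly a factor of four, which is the motivation for the Temporal U-Net structure.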