Consonant and vowel reduction occur frequently in Uyghur speech, which can degrade the performance of Uyghur automatic speech recognition (ASR). Our recently proposed masking-based learning strategy, Phone Masking Training (PMT), alleviates the impact of this phenomenon in Uyghur ASR. Although PMT achieves remarkable improvements, there is still room for further gains due to the granularity mismatch between the masking unit of PMT (phoneme) and the modeling unit (word-piece). To boost the performance of PMT, we propose a multi-modeling-unit training (MMUT) architecture fused with PMT (PM-MMUT). The idea of the MMUT framework is to split the encoder into two parts: one mapping acoustic feature sequences to a phoneme-level representation (AF-to-PLR), and one mapping the phoneme-level representation to a word-piece-level representation (PLR-to-WPLR). An intermediate phoneme-based CTC loss optimizes AF-to-PLR, allowing it to learn the rich phoneme-level context information brought by PMT. Experimental results on Uyghur ASR show that the proposed approach yields significant improvements, outperforming pure PMT (reducing WER from 24.0 to 23.7 on Read-Test and from 38.4 to 36.8 on Oral-Test, respectively). We also conduct experiments on the 960-hour Librispeech benchmark using ESPnet1, achieving about 10% relative WER reduction on all test sets without LM fusion, compared with the latest official ESPnet1 pre-trained model.
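To make the encoder split concrete, the following is a minimal PyTorch sketch of the MMUT idea: a first encoder stack (AF-to-PLR) supervised by an intermediate phoneme-level CTC loss, and a second stack (PLR-to-WPLR) feeding the final word-piece-level CTC loss. All names, layer counts, vocabulary sizes, and the interpolation weight `lam` are illustrative assumptions, not the paper's actual configuration, and the generic Transformer layers stand in for whatever encoder the authors use in ESPnet1.

```python
import torch
import torch.nn as nn

class MMUTEncoder(nn.Module):
    """Sketch of the MMUT encoder split: AF-to-PLR layers produce a
    phoneme-level representation for an intermediate CTC head, and
    PLR-to-WPLR layers produce the word-piece-level representation
    for the final CTC head. Hyperparameters here are assumptions."""

    def __init__(self, feat_dim=80, d_model=256, n_phonemes=100,
                 n_wordpieces=5000, af_layers=6, wp_layers=6):
        super().__init__()
        self.input_proj = nn.Linear(feat_dim, d_model)
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model, nhead=4, dim_feedforward=1024, batch_first=True)
        # AF-to-PLR: acoustic feature sequence -> phoneme-level representation
        self.af_to_plr = nn.TransformerEncoder(make_layer(), num_layers=af_layers)
        # PLR-to-WPLR: phoneme-level -> word-piece-level representation
        self.plr_to_wplr = nn.TransformerEncoder(make_layer(), num_layers=wp_layers)
        self.phone_head = nn.Linear(d_model, n_phonemes)   # intermediate CTC head
        self.wp_head = nn.Linear(d_model, n_wordpieces)    # final CTC head

    def forward(self, feats):
        # feats: (batch, time, feat_dim); with PMT, random phoneme spans of
        # the input features would already be masked before this point.
        x = self.input_proj(feats)
        plr = self.af_to_plr(x)            # phoneme-level representation
        wplr = self.plr_to_wplr(plr)       # word-piece-level representation
        return self.phone_head(plr), self.wp_head(wplr)

def mmut_loss(phone_logits, wp_logits, in_lens, phone_tgts, phone_lens,
              wp_tgts, wp_lens, lam=0.3):
    """Interpolate the intermediate phoneme CTC loss with the final
    word-piece CTC loss; lam is an assumed weight, not the paper's value."""
    ctc = nn.CTCLoss(blank=0, zero_infinity=True)
    # nn.CTCLoss expects log-probabilities of shape (time, batch, vocab)
    phone_lp = phone_logits.log_softmax(-1).transpose(0, 1)
    wp_lp = wp_logits.log_softmax(-1).transpose(0, 1)
    l_phone = ctc(phone_lp, phone_tgts, in_lens, phone_lens)
    l_wp = ctc(wp_lp, wp_tgts, in_lens, wp_lens)
    return lam * l_phone + (1.0 - lam) * l_wp
```

Structurally this resembles intermediate-CTC-style regularization: the auxiliary phoneme loss shapes the lower layers toward phoneme-level context, which is exactly the granularity at which PMT applies its masking.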