Deep neural networks, such as the Deep-FSMN, have been widely studied for keyword spotting (KWS) applications. However, computational resources for these networks are significantly constrained since they usually run on-call on edge devices. In this paper, we present BiFSMN, an accurate and extremely efficient binary neural network for KWS. We first construct a High-frequency Enhancement Distillation scheme for binarization-aware training, which emphasizes the high-frequency information from the full-precision network's representation that is more crucial to the optimization of the binarized network. Then, to allow instant and adaptive accuracy-efficiency trade-offs at runtime, we also propose a Thinnable Binarization Architecture to further liberate the acceleration potential of the binarized network from the topology perspective. Moreover, we implement a Fast Bitwise Computation Kernel for BiFSMN on ARMv8 devices, which fully utilizes registers and increases instruction throughput to push the limit of deployment efficiency. Extensive experiments show that BiFSMN outperforms existing binarization methods by convincing margins on various datasets and is even comparable with the full-precision counterpart (e.g., less than 3% drop on Speech Commands V1-12). We highlight that, benefiting from the thinnable architecture and the optimized 1-bit implementation, BiFSMN achieves an impressive 22.3x speedup and 15.5x storage saving on real-world edge hardware. Our code is released at https://github.com/htqin/BiFSMN.
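To make the two core ingredients of the abstract concrete, below is a minimal PyTorch sketch of (a) a binarized linear layer with sign binarization and a straight-through estimator, and (b) a distillation loss that emphasizes the high-frequency part of the teacher's representation. The names `BinaryLinear` and `high_freq_distill_loss`, and the use of a moving-average (low-pass) residual as the high-frequency component, are illustrative assumptions for exposition, not the paper's exact High-frequency Enhancement Distillation formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a clipped straight-through estimator (assumed, standard BNN practice)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where |x| <= 1, as in the usual clipped STE.
        return grad_output * (x.abs() <= 1).float()


class BinaryLinear(nn.Linear):
    """Linear layer whose weights and activations are binarized to {-1, +1}."""

    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        x_bin = BinarizeSTE.apply(x)
        return F.linear(x_bin, w_bin, self.bias)


def high_freq_distill_loss(student_feat, teacher_feat, kernel_size=3):
    """Distillation loss on the high-frequency component of the representations.

    Here "high-frequency" is approximated (an assumption of this sketch) as the
    residual after a moving-average low-pass filter along the time axis.
    Both features have shape (batch, time, dim).
    """
    def high_pass(feat):
        low = F.avg_pool1d(
            feat.transpose(1, 2), kernel_size, stride=1, padding=kernel_size // 2
        ).transpose(1, 2)
        return feat - low

    return F.mse_loss(high_pass(student_feat), high_pass(teacher_feat))
```

In a binarization-aware training loop, the full-precision teacher's hidden features would play the role of `teacher_feat`, and `high_freq_distill_loss` would be added to the task loss of the binarized student; the bitwise XNOR-popcount kernel mentioned in the abstract replaces `F.linear` only at deployment time.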