It is appealing but challenging to achieve real-time deep neural network (DNN) inference on mobile devices, because even powerful modern mobile devices are considered ``resource-constrained'' when executing large-scale DNNs. This necessitates sparse model inference via weight pruning, i.e., DNN weight sparsity, and calls for a new DNN weight sparsity scheme that facilitates real-time inference on mobile devices while preserving high sparse-model accuracy. This paper designs a novel mobile inference acceleration framework, GRIM, that is General to both convolutional neural networks (CNNs) and recurrent neural networks (RNNs) and achieves Real-time execution and high accuracy, leveraging fine-grained structured sparse model Inference and compiler optimizations for Mobiles. We start by proposing a new fine-grained structured sparsity scheme through Block-based Column-Row (BCR) pruning. Based on this new fine-grained structured sparsity, our GRIM framework consists of two parts: (a) compiler optimization and code generation for real-time mobile inference; and (b) BCR pruning optimizations for determining pruning hyperparameters and performing weight pruning. We compare GRIM with Alibaba MNN, TVM, TensorFlow-Lite, a sparse implementation based on CSR, PatDNN, and ESE (a representative FPGA inference acceleration framework for RNNs), and achieve up to 14.08x speedup.
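To make the BCR idea concrete, the following is a minimal illustrative sketch (not the paper's implementation): a weight matrix is partitioned into blocks, and within each block the lowest-magnitude columns and rows are zeroed, yielding sparsity that is regular inside each block yet flexible across blocks. The function name `bcr_prune`, the block shape, and the pruning fractions are all assumptions made for illustration.

```python
import numpy as np

def bcr_prune(W, block_shape=(4, 4), col_frac=0.5, row_frac=0.5):
    """Illustrative Block-based Column-Row (BCR) pruning sketch:
    split W into blocks, then zero the lowest-L2-norm columns and
    rows *within each block* (fine-grained structured sparsity).
    Hyperparameters here are arbitrary, not the paper's."""
    W = W.copy()
    bh, bw = block_shape
    for i in range(0, W.shape[0], bh):
        for j in range(0, W.shape[1], bw):
            blk = W[i:i + bh, j:j + bw]  # numpy view: edits hit W
            # prune the weakest columns of this block (by L2 norm)
            n_cols = int(blk.shape[1] * col_frac)
            if n_cols:
                cols = np.argsort(np.linalg.norm(blk, axis=0))[:n_cols]
                blk[:, cols] = 0.0
            # then prune the weakest rows of the same block
            n_rows = int(blk.shape[0] * row_frac)
            if n_rows:
                rows = np.argsort(np.linalg.norm(blk, axis=1))[:n_rows]
                blk[rows, :] = 0.0
    return W

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
Wp = bcr_prune(W)
print(f"sparsity: {np.mean(Wp == 0):.2f}")
```

Because each pruned column and row spans a whole block, the surviving nonzeros retain a regular layout that compiler-generated mobile kernels can exploit, unlike unstructured (element-wise) sparsity.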