Machine learning on tiny IoT devices based on microcontroller units (MCUs) is appealing but challenging: the memory of microcontrollers is 2-3 orders of magnitude smaller than even that of mobile phones. We propose MCUNet, a framework that jointly designs the efficient neural architecture (TinyNAS) and the lightweight inference engine (TinyEngine), enabling ImageNet-scale inference on microcontrollers. TinyNAS adopts a two-stage neural architecture search approach that first optimizes the search space to fit the resource constraints, then specializes the network architecture within the optimized search space. TinyNAS can automatically handle diverse constraints (i.e., device, latency, energy, memory) at low search cost. TinyNAS is co-designed with TinyEngine, a memory-efficient inference library, to expand the search space and fit a larger model. TinyEngine adapts the memory scheduling according to the overall network topology rather than layer-wise optimization, reducing memory usage by 4.8x and accelerating inference by 1.7-3.3x compared to TF-Lite Micro and CMSIS-NN. MCUNet is the first to achieve >70% ImageNet top-1 accuracy on an off-the-shelf commercial microcontroller, using 3.5x less SRAM and 5.7x less Flash compared to quantized MobileNetV2 and ResNet-18. On visual & audio wake-word tasks, MCUNet achieves state-of-the-art accuracy and runs 2.4-3.4x faster than MobileNetV2- and ProxylessNAS-based solutions with 3.7-4.1x smaller peak SRAM. Our study suggests that the era of always-on tiny machine learning on IoT devices has arrived. Code and models can be found here: https://tinyml.mit.edu.
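The two-stage idea can be illustrated with a minimal sketch: first score candidate search spaces by how well their randomly sampled models use the memory budget, then search within the winning space. This is purely illustrative pseudocode under assumed proxies, not the paper's actual implementation; the memory model, accuracy surrogate, and space definitions below are made up for demonstration.

```python
import random

# Illustrative sketch of a two-stage NAS loop (assumptions, not MCUNet's code).
SRAM_LIMIT_KB = 320  # e.g. an STM32F746 MCU has 320 kB of SRAM

def peak_sram_kb(res, width):
    # Toy proxy: peak activation memory scales with input area and width.
    return res * res * 8 * width / 1024

def sample_model(space, rng):
    # Sample a (resolution, width-multiplier) configuration from the space.
    res = rng.choice(space["resolutions"])
    width = rng.choice(space["widths"])
    return {"res": res, "width": width,
            "acc_proxy": res * width,  # toy accuracy surrogate
            "sram_kb": peak_sram_kb(res, width)}

def optimize_search_space(spaces, n=200, seed=0):
    """Stage 1: pick the search space whose random samples make the best
    use of the memory budget (highest mean feasible accuracy proxy)."""
    rng = random.Random(seed)
    def score(space):
        models = [sample_model(space, rng) for _ in range(n)]
        ok = [m for m in models if m["sram_kb"] <= SRAM_LIMIT_KB]
        return sum(m["acc_proxy"] for m in ok) / n if ok else float("-inf")
    return max(spaces, key=score)

def specialize(space, n=100, seed=1):
    """Stage 2: search inside the chosen space for the best feasible model."""
    rng = random.Random(seed)
    models = [sample_model(space, rng) for _ in range(n)]
    ok = [m for m in models if m["sram_kb"] <= SRAM_LIMIT_KB]
    return max(ok, key=lambda m: m["acc_proxy"])

candidate_spaces = [
    {"resolutions": [96, 112, 128], "widths": [0.35, 0.5]},
    {"resolutions": [128, 160, 176], "widths": [0.5, 0.75]},
    {"resolutions": [176, 208, 224], "widths": [0.75, 1.0]},
]
best = specialize(optimize_search_space(candidate_spaces))
```

The point of stage 1 is that a well-chosen space makes the per-device search cheap: every model drawn from it is already close to the memory budget, so stage 2 only has to rank near-feasible candidates.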