In the Internet of Things era, where many interconnected and heterogeneous mobile and fixed smart devices coexist, distributing intelligence from the cloud to the edge has become a necessity. Due to limited computational and communication capabilities, low memory, and a limited energy budget, bringing artificial intelligence algorithms to peripheral devices, such as the end nodes of a sensor network, is a challenging task and requires the design of innovative methods. In this work, we present PhiNets, a new scalable backbone optimized for deep-learning-based image processing on resource-constrained platforms. PhiNets are based on inverted residual blocks specifically designed to decouple computational cost, working memory, and parameter memory, thus exploiting all the available resources. Combined with a YoloV2 detection head and Simple Online and Realtime Tracking (SORT), the proposed architecture achieves state-of-the-art results in (i) detection on the COCO and VOC2012 benchmarks and (ii) tracking on the MOT15 benchmark. PhiNets reduce the parameter count by 87% to 93% with respect to previous state-of-the-art models (EfficientNetv1, MobileNetv2) while achieving better performance at a lower computational cost. Moreover, we demonstrate our approach on a prototype node based on an STM32H743 microcontroller (MCU) with 2 MB of internal Flash and 1 MB of RAM, achieving power requirements on the order of 10 mW. The code for PhiNets is publicly available on GitHub.
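To make the decoupling claim concrete, the sketch below estimates the three resource figures for a single MobileNetV2-style inverted residual block (1x1 expansion, kxk depthwise, 1x1 projection, stride 1, biases and batch-norm parameters omitted). This is an illustrative cost model under those assumptions, not the exact PhiNets formulation; the function name and its arguments are hypothetical. It shows why the expansion factor `t` scales both parameters and MACs, while the spatial resolution `h x w` scales only MACs and the peak activation (working) memory.

```python
def inverted_residual_cost(c_in, c_out, t, h, w, k=3):
    """Rough cost model for one stride-1 inverted residual block.

    c_in, c_out : input/output channels
    t           : channel expansion factor
    h, w        : spatial resolution of the feature map
    k           : depthwise kernel size (default 3x3)
    """
    c_exp = t * c_in
    params_expand = c_in * c_exp       # 1x1 pointwise expansion weights
    params_dw = c_exp * k * k          # kxk depthwise weights
    params_project = c_exp * c_out     # 1x1 pointwise projection weights
    params = params_expand + params_dw + params_project
    # At stride 1 every weight is applied once per output pixel,
    # so MACs = H * W * params for this block.
    macs = h * w * params
    # Working memory is dominated by the expanded intermediate tensor.
    peak_activation = h * w * c_exp    # elements, before quantization
    return {"params": params, "macs": macs, "peak_activation": peak_activation}


cost = inverted_residual_cost(c_in=32, c_out=32, t=6, h=56, w=56)
print(cost)  # {'params': 14016, 'macs': 43954176, 'peak_activation': 602112}
```

Halving the resolution cuts MACs and peak activation memory by 4x while leaving the parameter count untouched, whereas shrinking `t` reduces all three; exposing these as separate scaling knobs is what lets a backbone fill a target device's Flash, RAM, and compute budgets independently.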