Industrial Internet of Things (IIoT) applications can benefit from leveraging edge computing. For example, applications underpinned by deep neural network (DNN) models can be sliced and distributed across the IIoT device and the edge of the network to improve overall inference performance and to enhance the privacy of input data, such as industrial product images. However, low network performance between IIoT devices and the edge is often a bottleneck. In this study, we develop ScissionLite, a holistic framework for accelerating distributed DNN inference using the Transfer Layer (TL). The TL is a traffic-aware layer inserted at the optimal slicing point between DNN model slices to decrease outbound network traffic without a significant accuracy drop. For the TL, we implement a new lightweight down/upsampling network for performance-limited IIoT devices. In ScissionLite, we develop ScissionTL, the Preprocessor, and the Offloader, which carry out the end-to-end activities for deploying DNN slices with the TL. They determine the optimal slicing point of the DNN, prepare pre-trained DNN slices that include the TL, and execute the DNN slices on an IIoT device and the edge. Employing the TL in sliced DNN models incurs negligible overhead. ScissionLite improves inference latency by up to 16 and 2.8 times compared to execution on the local device and an existing state-of-the-art model slicing approach, respectively.
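
To make the Transfer Layer idea concrete, the sketch below illustrates one possible way a lightweight down/upsampling pair could be inserted at a slicing point of a pre-trained DNN, with the downsampling half running on the IIoT device and the upsampling half on the edge. This is a minimal PyTorch sketch under stated assumptions: the slicing point (index 6 of VGG-16's feature extractor), the channel count, and the specific convolution/upsample layers are illustrative placeholders, not ScissionLite's actual TL design.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class TLDown(nn.Module):
    """Device-side half of a hypothetical TL: compresses the intermediate
    feature map before it is sent over the network."""
    def __init__(self, channels, reduction=2):
        super().__init__()
        # A strided depthwise convolution halves spatial resolution cheaply,
        # which is suitable for performance-limited IIoT devices.
        self.down = nn.Conv2d(channels, channels, kernel_size=3,
                              stride=reduction, padding=1, groups=channels)

    def forward(self, x):
        return self.down(x)

class TLUp(nn.Module):
    """Edge-side half of the TL: restores the feature-map shape so the
    remaining layers of the original model can run unchanged."""
    def __init__(self, channels, scale=2):
        super().__init__()
        self.up = nn.Sequential(
            nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.up(x)

# Slice a model at an assumed layer index and wrap the halves with the TL pair.
# The split index and channel count below are placeholders for illustration.
model = models.vgg16(weights=None)
split = 6
device_slice = nn.Sequential(*list(model.features[:split]), TLDown(channels=128))
edge_slice = nn.Sequential(TLUp(channels=128), *list(model.features[split:]))

x = torch.randn(1, 3, 224, 224)              # example input image batch
compressed = device_slice(x)                  # runs on the IIoT device; sent to the edge
restored_features = edge_slice(compressed)    # runs on the edge server
```

In this sketch the tensor transmitted between the device and the edge has a quarter of the spatial elements of the original intermediate feature map, which is the kind of outbound-traffic reduction the TL targets; in practice the TL would be fine-tuned so that the downsample/upsample pair does not cause a significant accuracy drop.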