We propose NNStreamer, a software system that handles neural networks as filters of stream pipelines, applying the stream processing paradigm to deep neural network applications. A new trend accompanying the wide spread of deep neural network applications is on-device AI: processing neural networks directly on mobile, edge, or IoT devices instead of on cloud servers. Emerging privacy concerns, data transmission costs, and operational costs make on-device AI necessary, especially when a massive number of devices is deployed. NNStreamer efficiently handles neural networks within complex data stream pipelines on devices, significantly improving overall performance with minimal effort. Moreover, NNStreamer simplifies implementation and allows off-the-shelf media filters to be reused directly, which reduces development costs considerably. We are already deploying NNStreamer in a wide range of products and platforms, including the Galaxy series and various consumer electronic devices. Experimental results suggest that the pipeline architecture and NNStreamer reduce development costs and improve performance. NNStreamer is an open-source project incubated by Linux Foundation AI, available to the public and applicable to various hardware and software platforms.
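As a minimal sketch of the idea, the pipeline below feeds camera frames through standard GStreamer media filters into NNStreamer's tensor_converter and tensor_filter elements, treating the neural network as just another filter in the stream. The camera source, input resolution, and model file name (mobilenet_v1.tflite) are illustrative assumptions, not details taken from this abstract.

    # Camera frames -> media filters -> tensor conversion -> neural network inference
    gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! \
        video/x-raw,width=224,height=224,format=RGB ! \
        tensor_converter ! \
        tensor_filter framework=tensorflow-lite model=mobilenet_v1.tflite ! \
        tensor_sink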