Object detection has developed greatly within the past few years. There is a need for lighter models where hardware limitations exist, as well as a demand for models tailored to mobile devices. In this article, we assess the methods used when creating algorithms that address these issues. The main goal of this article is to increase accuracy in state-of-the-art algorithms while maintaining speed and real-time efficiency. The most significant issues in one-stage object detection pertain to small objects and inaccurate localization. As a solution, we created a new network named MobileDenseNet suitable for embedded systems. We also developed a lightweight neck, FCPNLite, for mobile devices that aids the detection of small objects. Our survey of the literature revealed that very few papers address necks for embedded systems. What differentiates our network from others is its use of feature concatenation. A small yet significant change to the head of the network increased accuracy without reducing speed or adding parameters. In short, our mAP on the challenging COCO and Pascal VOC datasets reached 24.8% and 76.8%, respectively, higher than the rates recorded by other state-of-the-art systems thus far. Our network is able to increase accuracy while maintaining real-time efficiency on mobile devices; we measured its operational speed on a Pixel 3 (Snapdragon 845) at 22.8 fps. The source code of this research is available at https://github.com/hajizadeh/MobileDenseNet.
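The abstract singles out feature concatenation in the neck as the network's distinguishing choice. As a minimal illustrative sketch (not the paper's implementation), the snippet below contrasts the two common ways a detection neck fuses a lateral feature map with a top-down feature map: element-wise addition, which keeps the channel count fixed, and channel concatenation, which preserves both inputs for the following convolution to mix. Feature maps are modeled here as plain (channels, height, width) nested lists to keep the sketch dependency-free.

```python
# Illustrative sketch only: contrasts addition-based vs concatenation-based
# fusion of two feature maps, modeled as (channels, height, width) nested lists.

def add_fuse(a, b):
    """Element-wise addition (classic FPN-style fusion): channels unchanged."""
    return [[[x + y for x, y in zip(row_a, row_b)]
             for row_a, row_b in zip(ch_a, ch_b)]
            for ch_a, ch_b in zip(a, b)]

def concat_fuse(a, b):
    """Channel concatenation (as in concatenation-based necks):
    channel counts add up, so neither input is merged away before the
    next convolution can learn how to combine them."""
    return a + b  # stack along the channel axis

def shape(fm):
    return (len(fm), len(fm[0]), len(fm[0][0]))

# Two toy 2-channel, 2x2 feature maps.
f1 = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
f2 = [[[1, 0], [0, 1]], [[2, 2], [2, 2]]]

print(shape(add_fuse(f1, f2)))     # (2, 2, 2): channels unchanged
print(shape(concat_fuse(f1, f2)))  # (4, 2, 2): channels concatenated
```

The trade-off the sketch makes visible is the one the paper navigates: concatenation doubles the channel count at each fusion point, so a mobile-friendly neck must keep the subsequent convolutions light enough to stay real-time.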