In precision crop protection, (target-oriented) object detection in image processing can help navigate Unmanned Aerial Vehicles (UAVs, crop protection drones) to the right place to apply pesticide, so that unnecessary application to non-target areas can be avoided. Deep learning algorithms dominate modern computer vision tasks but demand high computing time, a large memory footprint, and considerable power consumption. Based on Edge Artificial Intelligence, we investigate the three main paths to addressing this problem: hardware accelerators, efficient algorithms, and model compression. Finally, we integrate them and propose a solution based on a light deep neural network (DNN), called Ag-YOLO, which enables crop protection UAVs to detect targets and operate autonomously. This solution is compact, low-cost, flexible, fast, and energy-efficient: the hardware weighs only 18 grams and consumes 1.5 watts, and the developed DNN model requires only 838 kilobytes of disc space. We tested the developed hardware and software against the tiny version of the state-of-the-art YOLOv3 framework, known as YOLOv3-Tiny, on the task of detecting individual palm trees in a plantation. Ag-YOLO reached an average F1 score of 0.9205 at 36.5 frames per second, compared with similar accuracy at 18 frames per second and a model size of 8.66 megabytes for YOLOv3-Tiny. The developed detection system can be easily plugged into any machine already purchased, as long as that machine has a USB port and runs a Linux operating system.
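To make the "plug into any Linux machine with a USB port" deployment concrete, the following is a minimal inference sketch. It assumes the accelerator is an OpenVINO-compatible USB device such as an Intel Neural Compute Stick (the abstract only specifies a USB device weighing 18 g and drawing about 1.5 W, so the "MYRIAD" device name, the file names, and the NCHW input layout are illustrative assumptions, not the authors' exact setup).

```python
# Hypothetical deployment sketch: run a small detection model on an
# OpenVINO-compatible USB accelerator attached to a Linux host.
import cv2
import numpy as np
from openvino.runtime import Core

core = Core()
# "ag_yolo.xml"/".bin" are placeholder names for the ~838 KB IR model files.
model = core.read_model("ag_yolo.xml")
# "MYRIAD" targets Neural Compute Stick class hardware (assumed, not stated).
compiled = core.compile_model(model, device_name="MYRIAD")

# Load one frame and resize it to the network input size (NCHW layout assumed).
frame = cv2.imread("palm_plantation.jpg")
_, _, h, w = compiled.input(0).shape
blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)

# Synchronous inference; the raw output tensor still needs YOLO-style decoding
# (anchor boxes, confidence thresholding, non-maximum suppression).
result = compiled([blob])[compiled.output(0)]
print("raw detection tensor shape:", result.shape)
```

In such a setup, the host CPU only handles frame capture and post-processing, while the DNN itself runs on the low-power USB accelerator, which is what keeps the overall system small, cheap, and energy-efficient.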