Field robotic harvesting is a promising technique in the recent development of the agricultural industry. It is vital for robots to recognise and localise fruits before harvesting in natural orchards. However, the workspace of harvesting robots in orchards is complex: many fruits are occluded by branches and leaves. It is therefore important to estimate a proper grasping pose for each fruit before performing the manipulation. In this study, a geometry-aware network, A3N, is proposed to perform end-to-end instance segmentation and grasping estimation using both colour and geometry sensory data from an RGB-D camera. In addition, workspace geometry modelling is applied to assist the robotic manipulation. Moreover, we implement a global-to-local scanning strategy, which enables robots to accurately recognise and retrieve fruits in field environments with two consumer-level RGB-D cameras. We also comprehensively evaluate the accuracy and robustness of the proposed network in experiments. The experimental results show that A3N achieves an instance segmentation accuracy of 0.873, with an average computation time of 35 ms. The average accuracy of grasping estimation is 0.61 cm in centre and 4.8$^{\circ}$ in orientation, respectively. Overall, the robotic system that utilises the global-to-local scanning and A3N achieves a harvesting success rate of 70\% - 85\% in field harvesting experiments.