This paper introduces DGBench, a fully reproducible open-source testing system to enable benchmarking of dynamic grasping in environments with unpredictable relative motion between robot and object. We use the proposed benchmark to compare several visual perception arrangements. Traditional perception systems developed for static grasping are unable to provide feedback during the final phase of a grasp due to sensor minimum range, occlusion, and a limited field of view. A multi-camera eye-in-hand perception system is presented that has advantages over commonly used camera configurations. We quantitatively evaluate the performance on a real robot with an image-based visual servoing grasp controller and show a significantly improved success rate on a dynamic grasping task.