Picking up transparent objects is still a challenging task for robots. The visual properties of transparent objects, such as reflection and refraction, cause current grasping methods that rely on camera sensing to fail to detect and localise them. However, humans can handle transparent objects well by first observing their coarse profile and then poking an area of interest to obtain a fine profile for grasping. Inspired by this, we propose a novel framework of vision-guided tactile poking for transparent object grasping. In the proposed framework, a segmentation network is first used to predict horizontal upper regions, named poking regions, where the robot can poke the object to obtain a good tactile reading while causing minimal disturbance to the object's state. A poke is then performed with a high-resolution GelSight tactile sensor. Given the local profile refined by the tactile reading, a heuristic grasp is planned for the transparent object. To mitigate the limitations of real-world data collection and labelling for transparent objects, a large-scale realistic synthetic dataset was constructed. Extensive experiments demonstrate that our proposed segmentation network can predict the potential poking regions with a high mean Average Precision (mAP) of 0.360, and that the vision-guided tactile poking increases the grasping success rate significantly from 38.9% to 85.2%. Thanks to its simplicity, our proposed approach could also be adapted to other force or tactile sensors and used for grasping other challenging objects. All the materials used in this paper are available at https://sites.google.com/view/tactilepoking.