Grasping objects whose physical properties are unknown remains a major challenge in robotics. Most solutions rely entirely on visual data to plan the best grasping strategy. However, to match human abilities and reliably pick up and hold unknown objects, integrating an artificial sense of touch into robotic systems is pivotal. This paper describes a novel model-based slip detection pipeline that can predict potentially failing grasps in real time and signal a necessary increase in grip force. Because it is model-based, the slip detector does not rely on manually collected data but exploits physics to generalize across different tasks. To evaluate the approach, a state-of-the-art vision-based tactile sensor that accurately estimates distributed forces was integrated into a grasping setup composed of a six-degree-of-freedom cobot and a two-finger gripper. Results show that the system can reliably predict slip while manipulating objects of different shapes, materials, and weights. The sensor can detect both translational and rotational slip in various scenarios, making it suitable for improving the stability of a grasp.