Cloth in the real world is often crumpled, self-occluded, or folded in on itself, so that key regions such as corners are not directly graspable, making manipulation difficult. We propose a system that leverages visual and tactile perception to unfold the cloth by grasping and sliding along its edges. By doing so, the robot is able to grasp two adjacent corners, enabling subsequent manipulation tasks like folding or hanging. As components of this system, we develop tactile perception networks that classify whether an edge is grasped and estimate the pose of the edge. We use the edge classification network to supervise a visuotactile edge grasp affordance network that can grasp edges with a 90% success rate. Once an edge is grasped, we demonstrate that the robot can slide along the cloth to the adjacent corner using real-time tactile pose estimation and control. See http://nehasunil.com/visuotactile/visuotactile.html for videos.