Tactile information plays a critical role in human dexterity. It reveals contact information that cannot be inferred directly from vision; in fact, humans can perform in-hand dexterous manipulation without using vision at all. Can we enable the same ability in a multi-finger robot hand? In this paper, we present Touch Dexterity, a new system that performs in-hand object rotation using touch alone, without seeing the object. Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) covering one side of the entire robot hand (palm, finger links, and fingertips). This design is low-cost, provides larger coverage of the object, and minimizes the Sim2Real gap at the same time. We train an in-hand rotation policy with Reinforcement Learning on diverse objects in simulation. Because it relies on touch-only sensing, the policy can be deployed directly on a real robot hand to rotate novel objects that were not present during training. We perform extensive ablations on how tactile information helps in-hand manipulation. Our project is available at https://touchdexterity.github.io.