Most current work on Sim2Real learning for robotic manipulation tasks leverages camera vision, which may be significantly occluded by the robot's hands during manipulation. Tactile sensing offers information complementary to vision and can compensate for the information lost to occlusion. However, tactile sensing has seen limited use in Sim2Real research because no simulated tactile sensors have been available. To bridge this gap, we introduce a novel approach for simulating a GelSight tactile sensor in the commonly used Gazebo simulator. Like the real GelSight sensor, the simulated sensor produces high-resolution images with an optical sensor that observes the interaction between the touched object and an opaque soft membrane. It can indirectly sense forces, geometry, texture, and other properties of the object, and it enables Sim2Real learning with tactile sensing. Preliminary experimental results show that the simulated sensor generates realistic outputs similar to those captured by a real GelSight sensor. All the materials used in this paper are available at https://danfergo.github.io/gelsight-simulation.
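The abstract describes rendering tactile images from the deformation of a soft membrane. As a minimal, hypothetical sketch of that idea (not the paper's actual pipeline), one can approximate a GelSight-style image by estimating surface normals from a contact depth map and shading them under several colored directional lights, as the real sensor's RGB illumination suggests; the function and light configuration below are illustrative assumptions.

```python
import numpy as np

def shade_tactile_image(depth, lights):
    """Render a tactile-like RGB image from a contact depth map by
    estimating per-pixel surface normals and applying Lambertian
    shading under colored directional lights (illustrative sketch)."""
    # Surface gradients via finite differences (rows, then columns).
    gy, gx = np.gradient(depth)
    # Normals: (-dz/dx, -dz/dy, 1), normalized per pixel.
    normals = np.dstack([-gx, -gy, np.ones_like(depth)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    img = np.zeros(depth.shape + (3,))
    for direction, color in lights:
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        # Lambertian term: clamp negative dot products to zero.
        lam = np.clip(normals @ d, 0.0, None)
        img += lam[..., None] * np.asarray(color, dtype=float)
    return np.clip(img, 0.0, 1.0)

# Example contact: a hemispherical indentation pressed into the membrane.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
r2 = (xx - w / 2) ** 2 + (yy - h / 2) ** 2
depth = np.where(r2 < 15 ** 2,
                 np.sqrt(np.maximum(15 ** 2 - r2, 0)) / 15, 0.0)

# Three colored lights from different directions, loosely mimicking
# the multi-colored illumination of GelSight-style sensors (assumed values).
lights = [((1, 0, 0.5), (0.9, 0.1, 0.1)),
          ((-1, 0, 0.5), (0.1, 0.9, 0.1)),
          ((0, 1, 0.5), (0.1, 0.1, 0.9))]
img = shade_tactile_image(depth, lights)
```

Sloped regions of the indentation reflect each light differently, so the contact geometry appears as color gradients in `img`, which is the cue a downstream learner would read from such images.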