Robot simulation has become an essential tool for data-driven manipulation tasks. However, most existing simulation frameworks lack either efficient and accurate models of the physical interaction between the robot and tactile sensors, or realistic tactile simulation. This makes sim-to-real transfer for tactile-based manipulation tasks challenging. In this work, we integrate simulation of robot dynamics and vision-based tactile sensors by modeling the physics of contact. This contact model uses simulated contact forces at the robot's end-effector to inform the generation of realistic tactile outputs. To close the sim-to-real transfer gap, we calibrate our physics simulator of robot dynamics, our contact model, and our tactile optical simulator with real-world data, and then demonstrate the effectiveness of the system on a zero-shot sim-to-real grasp stability prediction task, where we achieve an average accuracy of 90.7% across various objects. Experiments reveal the potential of applying our simulation framework to more complicated manipulation tasks. We open-source our simulation framework at https://github.com/CMURoboTouch/Taxim/tree/taxim-robot.