Tactile predictive models can be useful across several robotic manipulation tasks, e.g. robotic pushing, robotic grasping, slip avoidance, and in-hand manipulation. However, existing tactile prediction models have mostly been studied for image-based tactile sensors, and there is no comparative study identifying the best-performing models. In this paper, we present two novel data-driven, action-conditioned models for predicting tactile signals during real-world physical robot interaction tasks: (1) an action-conditioned tactile prediction model and (2) an action-conditioned tactile-video prediction model. We use a magnetic-based tactile sensor, which is challenging to analyse, to test state-of-the-art predictive models and the only existing bespoke tactile prediction model, and we compare their performance with that of our proposed models. We perform the comparison study on our novel tactile-enabled dataset, containing 51,000 tactile frames from a real-world robotic manipulation task with 11 flat-surfaced household objects. Our experimental results demonstrate the superiority of our proposed tactile prediction models in terms of qualitative, quantitative and slip prediction scores.