Tactile perception is essential for real-world manipulation tasks, yet the high cost and fragility of tactile sensors can limit their practicality. In this work, we explore BeadSight (a low-cost, open-source tactile sensor) together with a tactile pre-training approach, as an alternative to precise, pre-calibrated sensors. By pre-training with the tactile sensor and then disabling it during downstream tasks, we aim to improve robustness and reduce cost in manipulation systems. We investigate whether tactile pre-training, even with a low-fidelity sensor like BeadSight, can improve the performance of an imitation learning agent on complex manipulation tasks. Through visuo-tactile pre-training on both similar and dissimilar tasks, we analyze its impact on a longer-horizon downstream task. Our experiments show that visuo-tactile pre-training improved performance on a USB cable plugging task by up to 65% with vision-only inference. Additionally, on a longer-horizon drawer pick-and-place task, pre-training on a similar, dissimilar, or identical task consistently improved performance, highlighting the potential of a large-scale visuo-tactile pre-trained encoder.