Robots operating in real-world environments frequently encounter unknown objects with complex structures and articulated components, such as doors, drawers, cabinets, and tools. Perceiving, tracking, and manipulating these objects without prior knowledge of their geometry or kinematics remains a fundamental challenge in robotics. In this work, we present a novel method for visuo-tactile tracking of unseen objects (single, multiple, or articulated) during robotic interaction, without assuming any prior knowledge of object shape or dynamics. Our pose-tracking approach, termed ArtReg (Articulated Registration), integrates visuo-tactile point clouds in an unscented Kalman filter formulated on the SE(3) Lie group for point cloud registration. ArtReg detects possible articulated joints in objects using purposeful manipulation maneuvers, such as pushing or hold-pulling with a two-robot team. Furthermore, we leverage ArtReg to develop a closed-loop controller for goal-driven manipulation of articulated objects that moves the object into a desired pose configuration. We extensively evaluate our approach on various types of unknown objects in real-robot experiments, and we demonstrate the robustness of our method on objects with varying centers of mass, under low-light conditions, and against challenging visual backgrounds. Furthermore, we benchmark our approach on a standard dataset of articulated objects and demonstrate improved pose accuracy compared to state-of-the-art methods. Our experiments indicate that robust and accurate pose tracking leveraging visuo-tactile information enables robots to perceive and interact with unseen, complex articulated objects (with revolute or prismatic joints).
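The abstract's central component is an unscented Kalman filter formulated on the SE(3) Lie group for point cloud registration. As a rough, hedged illustration of how sigma points can be handled on SE(3) (this is not the paper's implementation; the helper names `hat`, `sigma_points`, `predict_measurements`, and the toy measurement model are assumptions introduced here), the sketch below perturbs a mean pose in the se(3) tangent space, pushes each sigma point through a simple point-transformation measurement model, and recovers the predicted measurement mean and covariance.

```python
# Minimal sketch of one unscented-transform step on SE(3), assuming a
# pose mean T with 6x6 tangent-space covariance P and a measurement model
# that maps known model points into the sensor frame. Not the authors' code.
import numpy as np
from scipy.linalg import expm


def hat(xi):
    """Map a 6-vector twist (v, w) to its 4x4 matrix in se(3)."""
    v, w = xi[:3], xi[3:]
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])
    X = np.zeros((4, 4))
    X[:3, :3] = W
    X[:3, 3] = v
    return X


def sigma_points(P, kappa=0.0):
    """Symmetric sigma points and weights in the 6-dim tangent space."""
    n = P.shape[0]
    L = np.linalg.cholesky((n + kappa) * P)
    pts = [np.zeros(n)]
    for i in range(n):
        pts.append(L[:, i])
        pts.append(-L[:, i])
    weights = np.array([kappa / (n + kappa)] + [1.0 / (2 * (n + kappa))] * (2 * n))
    return np.array(pts), weights


def predict_measurements(T_mean, P, model_points):
    """Propagate pose sigma points through the (hypothetical) measurement model."""
    pts, w = sigma_points(P)
    Z = []
    for xi in pts:
        T_i = T_mean @ expm(hat(xi))       # right-perturb the mean pose on SE(3)
        R, t = T_i[:3, :3], T_i[:3, 3]
        Z.append((model_points @ R.T + t).ravel())
    Z = np.array(Z)
    z_mean = w @ Z
    z_cov = (Z - z_mean).T @ np.diag(w) @ (Z - z_mean)
    return z_mean, z_cov


# Usage: a pose estimate with small uncertainty and three model points.
T0 = np.eye(4)
P0 = 1e-3 * np.eye(6)
model = np.array([[0.1, 0.0, 0.0],
                  [0.0, 0.1, 0.0],
                  [0.0, 0.0, 0.1]])
z_mean, z_cov = predict_measurements(T0, P0, model)
print(z_mean.shape, z_cov.shape)   # (9,) (9, 9)
```

In a full filter, the predicted measurement statistics above would be fused with the observed visuo-tactile point cloud to correct the pose estimate in the tangent space before mapping back onto SE(3); the sketch only shows the sigma-point propagation step.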