Autonomous manipulation systems operating in domains where human intervention is difficult or impossible (e.g., underwater, extraterrestrial, or hazardous environments) require a high degree of robustness to sensing and communication failures. Crucially, motion planning and control algorithms require a stream of accurate joint angle data provided by joint encoders, the failure of which may result in an unrecoverable loss of functionality. In this paper, we present a novel method for retrieving the joint angles of a robot manipulator using only a single RGB image of its current configuration, opening up an avenue for recovering system functionality when conventional proprioceptive sensing is unavailable. Our approach, based on a distance-geometric representation of the configuration space, exploits knowledge of the robot's kinematic model to train a shallow neural network that performs a 2D-to-3D regression of distances associated with detected structural keypoints. It is shown that the resulting Euclidean distance matrix uniquely corresponds to the observed configuration, from which joint angles can be recovered via multidimensional scaling and a simple inverse kinematics procedure. We evaluate the performance of our approach on real RGB images of a Franka Emika Panda manipulator, showing that the proposed method is efficient and exhibits solid generalization ability. Furthermore, we show that our method can be easily combined with a dense refinement technique to obtain superior results.
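The recovery step described above can be illustrated with a minimal sketch: given a (squared) Euclidean distance matrix over the detected keypoints, classical multidimensional scaling recovers the 3D keypoint positions up to a rigid transform (and reflection). The function and toy keypoint coordinates below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def points_from_edm(D):
    """Classical MDS: recover 3D coordinates (up to a rigid transform
    and reflection) from a squared Euclidean distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    G = -0.5 * J @ D @ J                  # Gram matrix of centered points
    w, V = np.linalg.eigh(G)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:3]         # top-3 eigenpairs -> 3D embedding
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy example: four keypoints along a hypothetical kinematic chain.
P = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.3],
              [0.2, 0.0, 0.5],
              [0.4, 0.1, 0.5]])
D = np.square(np.linalg.norm(P[:, None] - P[None, :], axis=-1))  # squared EDM
P_rec = points_from_edm(D)
# P_rec reproduces P up to rotation, translation, and reflection;
# its pairwise distances match D exactly.
```

In the full pipeline, the network would predict the entries of `D` from a single RGB image, and the recovered keypoint positions would then feed an inverse kinematics routine to extract joint angles.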