We present Cerberus, an open-source Visual-Inertial-Leg Odometry (VILO) state estimation solution for legged robots that estimates position precisely on various terrains in real time using a set of standard sensors: stereo cameras, an IMU, joint encoders, and contact sensors. In addition to estimating robot states, we perform online kinematic parameter calibration and contact outlier rejection to substantially reduce position drift. Hardware experiments in various indoor and outdoor environments validate that calibrating kinematic parameters within Cerberus keeps estimation drift below 1% during long-distance, high-speed locomotion. This drift is lower than that of any other state estimation method using the same set of sensors reported in the literature. Moreover, our state estimator performs well even when the robot experiences large impacts and camera occlusion. The implementation of the state estimator, along with the datasets used to compute our results, is available at https://github.com/ShuoYangRobotics/Cerberus.
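To illustrate why kinematic parameter calibration matters, consider the standard leg-odometry velocity model; the symbols below ($\phi$, $\rho$, $p_f$, $J$, $\omega$) are introduced here for illustration and are not taken from the abstract. During a contact phase with the foot assumed stationary, forward kinematics gives the body-frame foot position $p_f(\phi,\rho)$ as a function of joint angles $\phi$ and kinematic parameters $\rho$ (e.g., link lengths), and the leg-odometry body velocity is

$$
v_b \;=\; -J(\phi,\rho)\,\dot{\phi} \;-\; \hat{\omega}\, p_f(\phi,\rho),
\qquad
J(\phi,\rho) = \frac{\partial p_f(\phi,\rho)}{\partial \phi},
$$

where $\hat{\omega}$ is the skew-symmetric matrix of the IMU angular velocity. Because errors in $\rho$ bias $v_b$ directly, and this bias is integrated over time, estimating $\rho$ online is one way to reduce accumulated position drift.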