Distance estimation is fundamental for a variety of robotic applications including navigation, manipulation, and planning. Inspired by the mammalian visual system, which gazes at specific objects (active fixation) and estimates when an object will reach it (time-to-contact), we develop a novel constraint between time-to-contact, acceleration, and distance that we call the $\tau$-constraint. It allows an active monocular camera to estimate depth using time-to-contact and inertial measurements (linear accelerations and angular velocities) within a window of time. Our work differs from other approaches by focusing on patches instead of feature points, because the change in a patch's area determines the time-to-contact directly. The result enables efficient estimation of distance while using only a small portion of the image, leading to a large speedup. We successfully validate the proposed $\tau$-constraint in the application of estimating camera position with a monocular grayscale camera and an Inertial Measurement Unit (IMU). Specifically, we test our method on different real-world planar objects over trajectories 8-40 seconds in duration and 7-35 meters long. Our method achieves 8.5 cm Average Trajectory Error (ATE), while the popular Visual-Inertial Odometry methods VINS-Mono and ROVIO achieve 12.2 and 16.9 cm ATE, respectively. Additionally, our implementation runs 27$\times$ faster than VINS-Mono's and 6.8$\times$ faster than ROVIO's. We believe these results indicate the $\tau$-constraint's potential to serve as the basis of robust, sophisticated algorithms for a multitude of applications involving an active camera and an IMU.
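As an illustrative sketch of how such a constraint can couple time-to-contact with inertial data (the notation $A(t)$, $Z(t)$, $a(t)$, $Z_0$, $\dot{Z}_0$ is introduced here only for exposition and is not necessarily the formulation used in the body of the paper): for an approximately fronto-parallel planar patch of image area $A(t)$ at distance $Z(t)$ along the optical axis, $A \propto 1/Z^2$, so the time-to-contact $\tau = -Z/\dot{Z}$ can be read directly from the patch area as
\begin{equation*}
\tau(t) \approx \frac{2\,A(t)}{\dot{A}(t)}.
\end{equation*}
If $a(t)$ denotes the (gravity-compensated) acceleration along the optical axis measured by the IMU, then $\dot{Z}(t) = \dot{Z}_0 + \int_0^t a\,ds$ and $Z(t) = Z_0 + \dot{Z}_0 t + \int_0^t\!\!\int_0^s a\,dr\,ds$. Substituting these into $Z(t) = -\tau(t)\,\dot{Z}(t)$ yields, for every sample in the time window,
\begin{equation*}
Z_0 + \dot{Z}_0\,\bigl(t + \tau(t)\bigr) \;=\; -\int_0^t\!\!\int_0^s a\,dr\,ds \;-\; \tau(t)\int_0^t a\,ds,
\end{equation*}
which is linear in the unknown initial distance $Z_0$ and velocity $\dot{Z}_0$ and can be solved by least squares over the window, recovering metric distance from time-to-contact and accelerations alone.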