Drawing inspiration from biology, we describe how visual sensing with a monocular camera can provide a reliable signal for navigating mobile robots. The work builds on a classic paper by Lee and Reddish (Nature, 1981, https://doi.org/10.1038/293293a0), which describes a behavioral strategy of diving seabirds based on a visual cue called time-to-contact. A closely related concept, time-to-transit (tau), is defined, and it is shown that idealized steering laws based on monocular camera perceptions of tau can reliably and robustly steer a mobile vehicle through a wide variety of spaces in which features perceived to lie on walls and other objects in the environment provide adequate visual cues. The contribution of the paper is twofold. First, it provides a simple theory of robust vision-based steering control. Second, it shows how the theory guides the implementation of robust visual navigation in ROS-Gazebo simulations as well as in deployment and experiments with a camera-equipped Jackal robot. As far as we know, the experiments described below are the first to demonstrate visual navigation based on tau.
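For readers encountering the cue for the first time, a minimal sketch may help (the notation here is illustrative; the formal definitions and control laws are developed in the body of the paper). For a pinhole camera translating forward at speed $v$, a feature point at depth $Z(t)$ along the optical axis projects to image coordinate $x(t)$, and its time-to-transit, the time remaining until the feature crosses the camera's transverse plane, can be recovered directly from image measurements as
\[
  \tau(t) \;=\; \frac{x(t)}{\dot{x}(t)} \;=\; \frac{Z(t)}{v}.
\]
A representative (idealized) balance-type steering law then compares averaged tau values from features in the left and right halves of the image,
\[
  \omega(t) \;=\; k\,\bigl(\bar{\tau}_{R}(t) - \bar{\tau}_{L}(t)\bigr),
\]
turning the vehicle away from the side whose features will transit sooner; the gain $k$ and the sign convention depend on the chosen body frame, and the steering laws analyzed in the paper may differ in detail.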