LiDAR sensors are a powerful tool for robot simultaneous localization and mapping (SLAM) in unknown environments, but the raw point clouds they produce are dense, computationally expensive to store, and unsuited for direct use by downstream autonomy tasks such as motion planning. For integration with motion planning, it is desirable for SLAM pipelines to generate lightweight geometric map representations. Such representations are also particularly well suited to man-made environments, which can often be viewed as a so-called "Manhattan world" built on a Cartesian grid. In this work we present a 3D LiDAR SLAM algorithm for Manhattan world environments which extracts planar features from point clouds to achieve lightweight, real-time localization and mapping. Our approach generates plane-based maps which occupy significantly less memory than their point cloud equivalents and are well suited to fast collision checking for motion planning. By leveraging the Manhattan world assumption, we target extraction of orthogonal planes to generate maps which are more structured and organized than those of existing plane-based LiDAR SLAM approaches. We demonstrate our approach in the high-fidelity AirSim simulator and in real-world experiments with a ground rover equipped with a Velodyne LiDAR. In both cases, we are able to generate high-quality maps and trajectory estimates at a rate matching the sensor rate of 10 Hz.
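To make the two core ideas of the abstract concrete — fitting planes to point cloud patches and regularizing their normals under the Manhattan world assumption — here is a minimal, hedged sketch. It is not the paper's implementation: the least-squares SVD plane fit and the axis-snapping step are standard textbook techniques, and all function names are illustrative.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of points.

    Returns (centroid, unit normal). The normal is the right
    singular vector associated with the smallest singular value
    of the centered points.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

def snap_to_manhattan(normal):
    """Snap a plane normal to the nearest Cartesian axis.

    A crude stand-in for the Manhattan-world regularization the
    abstract describes: planes are assumed to be axis-aligned, so
    the estimated normal is replaced by the closest of +/-x, +/-y, +/-z.
    """
    axis = int(np.argmax(np.abs(normal)))
    snapped = np.zeros(3)
    snapped[axis] = np.sign(normal[axis])
    return snapped

# Example: noisy samples drawn near the plane z = 0.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1.0, 1.0, 200),
    rng.uniform(-1.0, 1.0, 200),
    rng.normal(0.0, 0.01, 200),   # small out-of-plane noise
])
centroid, n = fit_plane(pts)
print(snap_to_manhattan(n))  # snaps to the z axis, (0, 0, +/-1)
```

Storing each wall or floor as a bounded plane (a centroid, a snapped normal, and an extent) rather than thousands of raw points is what yields the memory savings and fast collision checks the abstract claims.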