This paper presents CFEAR Radarodometry, an accurate, highly efficient, and learning-free method for large-scale radar odometry estimation. By keeping only the k strongest returns per azimuth and additionally filtering the radar data in Cartesian space, we compute a sparse set of oriented surface points for efficient and accurate scan matching. Registration is carried out by minimizing a point-to-line metric, with robustness to outliers achieved through a Huber loss. We further reduce drift by jointly registering the latest scan against a history of keyframes, and we find that our odometry method generalizes across sensor models and datasets without changing a single parameter. We evaluate our method in three widely different environments and demonstrate an improvement over the spatially cross-validated state of the art, with an overall translation error of 1.76% on a public urban radar odometry benchmark while running at 55 Hz on a single laptop CPU thread.
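The k-strongest filtering step mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the choice of k, and the minimum-intensity threshold `z_min` are illustrative assumptions. It operates on a polar radar scan stored as an intensity matrix with one row per azimuth:

```python
import numpy as np

def k_strongest(polar_scan, k=12, z_min=0.0):
    """Keep the k highest-intensity returns per azimuth (row) of a polar
    radar scan, zeroing all other range bins.

    polar_scan: (num_azimuths, num_range_bins) intensity array.
    k, z_min:   illustrative defaults, not the paper's tuned values;
                returns weaker than z_min are discarded even if in the top k.
    """
    filtered = np.zeros_like(polar_scan)
    for a, row in enumerate(polar_scan):
        idx = np.argpartition(row, -k)[-k:]   # indices of the k strongest bins
        keep = idx[row[idx] >= z_min]         # drop returns below the threshold
        filtered[a, keep] = row[keep]
    return filtered
```

The surviving returns per azimuth form the sparse point set from which oriented surface points are then computed for scan matching.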