Robot and vehicle localization with ground-penetrating radar (GPR) is attractive under adverse weather and environmental conditions, but existing techniques often struggle to estimate distances accurately when the B-scan images being compared differ only subtly. This study introduces a neural-network-based odometry method that exploits both the similarity and the difference features of GPR B-scan images to precisely estimate the Euclidean distance traveled between them. The custom network extracts multi-scale features from B-scan images captured at consecutive moments and then determines the Euclidean distance traveled by analyzing the similarities and differences between these features. To evaluate the method, an ablation study and comparison experiments were conducted on the publicly available CMU-GPR dataset. The experimental results show that our method consistently outperforms state-of-the-art counterparts in all tests: it achieves an overall weighted root mean square error (RMSE) of 0.449 m across all data sets, a 10.2\% reduction in RMSE compared to the best state-of-the-art method.
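As a rough illustration of the core idea only, and not the paper's actual learned architecture, the pipeline sketched in the abstract can be expressed with NumPy: extract multi-scale features from two consecutive B-scans, fuse their element-wise difference and similarity, and regress a scalar distance. The pooling scheme, scales, and (untrained) regression weights below are all hypothetical stand-ins for the network's learned components.

```python
import numpy as np

def multi_scale_features(bscan, scales=(1, 2, 4)):
    """Toy multi-scale descriptor: mean-pool the B-scan at several
    downsampling factors and concatenate the flattened results.
    (Hypothetical stand-in for the paper's learned CNN features.)"""
    feats = []
    for s in scales:
        h, w = bscan.shape[0] // s, bscan.shape[1] // s
        pooled = bscan[:h * s, :w * s].reshape(h, s, w, s).mean(axis=(1, 3))
        feats.append(pooled.ravel())
    return np.concatenate(feats)

def estimate_distance(bscan_a, bscan_b, weights=None):
    """Regress a scalar distance from the similarity (element-wise
    product) and difference (absolute difference) of the two feature
    vectors. With weights=None, uses random untrained weights purely
    for illustration; a real model would learn them from data."""
    fa = multi_scale_features(bscan_a)
    fb = multi_scale_features(bscan_b)
    combined = np.concatenate([np.abs(fa - fb), fa * fb])
    if weights is None:
        rng = np.random.default_rng(0)
        weights = rng.normal(scale=0.01, size=combined.size)
    return float(np.abs(combined @ weights))  # non-negative distance

# Usage: two synthetic, slightly differing 64x64 B-scans.
rng = np.random.default_rng(1)
scan_a = rng.normal(size=(64, 64))
scan_b = scan_a + 0.1 * rng.normal(size=(64, 64))
d = estimate_distance(scan_a, scan_b)
```

The similarity/difference fusion (`fa * fb` alongside `|fa - fb|`) mirrors the abstract's emphasis on analyzing both kinds of relation between consecutive scans, rather than feeding raw concatenated features to the regressor.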