Performing accurate localization while maintaining low communication bandwidth is an essential challenge of multi-robot simultaneous localization and mapping (MR-SLAM). In this paper, we tackle this problem by generating a compact yet discriminative feature descriptor with minimal inference time. We propose descriptor distillation, which formulates descriptor generation as a learning problem under the teacher-student framework. To achieve real-time descriptor generation, we design a compact student network and train it by transferring knowledge from a pre-trained large teacher model. To reduce the descriptor dimensionality from teacher to student, we propose a novel loss function that enables knowledge transfer between descriptors of different dimensions. The experimental results demonstrate that our model is 30% lighter than the state-of-the-art model and produces better descriptors in patch matching. Moreover, we build an MR-SLAM system based on the proposed method and show that our descriptor distillation achieves higher localization performance for MR-SLAM at lower bandwidth.
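The abstract does not specify the form of the cross-dimensional distillation loss. As one plausible sketch (not the paper's actual formulation), matching the pairwise-distance structure of the student's descriptors to the teacher's sidesteps the dimension mismatch, since the loss compares distance matrices rather than raw vectors; all function names and dimensions below are illustrative assumptions:

```python
import numpy as np

def pairwise_sq_dists(x):
    # Squared Euclidean distance matrix between rows of x (n x d -> n x n).
    sq = np.sum(x * x, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * x @ x.T

def cross_dim_distill_loss(teacher_desc, student_desc):
    # Hypothetical structural distillation loss: penalize the difference
    # between teacher and student pairwise-distance matrices. Well defined
    # even when teacher and student descriptor dimensions differ.
    dt = pairwise_sq_dists(teacher_desc)
    ds = pairwise_sq_dists(student_desc)
    return float(np.mean((dt - ds) ** 2))

rng = np.random.default_rng(0)
teacher = rng.standard_normal((8, 128))  # assumed 128-D teacher descriptors
student = rng.standard_normal((8, 32))   # assumed 32-D student descriptors
loss = cross_dim_distill_loss(teacher, student)
```

A distance-structure loss of this kind is only one option; projection heads that map both descriptors into a shared space are another common way to bridge a dimension gap in distillation.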