Sparse local feature extraction is widely regarded as essential in typical vision tasks such as simultaneous localization and mapping (SLAM), image matching, and 3D reconstruction. It nevertheless still has deficiencies requiring further improvement, chiefly the discriminative power of the extracted local descriptors, the localization accuracy of detected keypoints, and the efficiency of local feature learning. This paper focuses on improving the currently popular paradigm of sparse local feature learning under camera-pose supervision. To this end, it proposes a Shared Coupling-bridge scheme with four lightweight yet effective improvements for weakly supervised local feature (SCFeat) learning: i) a \emph{Feature-Fusion-ResUNet Backbone} (F2R-Backbone) for learning local descriptors, ii) a shared coupling-bridge normalization that improves the decoupled training of the description and detection networks, iii) an improved detection network with a peakiness measurement for detecting keypoints, and iv) the fundamental-matrix error used as a reward factor to further optimize feature-detection training. Extensive experiments show that our SCFeat improvements are effective: the method frequently achieves state-of-the-art performance on classic image matching and visual localization, and remains competitive on 3D reconstruction. For sharing and communication, our source code is available at https://github.com/sunjiayuanro/SCFeat.git.
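To make the peakiness idea concrete, the following is a minimal sketch of keypoint detection via a peakiness measure: a pixel of a score map is ranked by how much its response stands out from its local neighborhood. This is a generic, hypothetical formulation for illustration only; the exact measure, network, and parameters (e.g. `radius`, `top_k`) used by SCFeat may differ.

```python
import numpy as np

def peakiness_scores(score_map, radius=2):
    """Score each pixel by its response minus the mean response of its
    (2*radius+1)^2 neighborhood (hypothetical peakiness measure)."""
    H, W = score_map.shape
    padded = np.pad(score_map, radius, mode="edge")
    k = 2 * radius + 1
    local_mean = np.zeros_like(score_map, dtype=float)
    for dy in range(k):          # accumulate the neighborhood sum
        for dx in range(k):
            local_mean += padded[dy:dy + H, dx:dx + W]
    local_mean /= k * k
    return score_map - local_mean  # high where a pixel dominates its surroundings

def detect_keypoints(score_map, radius=2, top_k=4):
    """Return the (row, col) coordinates of the top_k peakiest responses."""
    peak = peakiness_scores(score_map, radius)
    idx = np.argsort(peak.ravel())[::-1][:top_k]
    return np.stack(np.unravel_index(idx, score_map.shape), axis=1)
```

In practice the score map would come from the detection network's output, and the hard top-k selection above would be replaced by a differentiable or non-maximum-suppression variant during training.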