We introduce a camera relocalization pipeline that combines absolute pose regression (APR) with direct feature matching. By incorporating exposure-adaptive novel view synthesis, our method addresses photometric distortions in outdoor environments that existing photometric-based methods fail to handle. With domain-invariant feature matching, it further improves pose regression accuracy through semi-supervised learning on unlabeled data. The pipeline consists of two components: a Novel View Synthesizer and DFNet. The former synthesizes novel views that compensate for changes in exposure; the latter regresses camera poses and extracts robust features that close the domain gap between real and synthetic images. We also introduce an online synthetic data generation scheme. These approaches effectively enhance camera pose estimation in both indoor and outdoor scenes. As a result, our method achieves state-of-the-art accuracy, outperforming existing single-image APR methods by as much as 56% and approaching the accuracy of 3D structure-based methods.
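The two-component design described above can be sketched in miniature. The sketch below is illustrative only: the `regress_pose`, `extract_features`, and `feature_matching_loss` functions are hypothetical stand-ins (the paper's actual models are learned networks), and NumPy placeholders replace the DFNet pose head, the feature extractor, and the Novel View Synthesizer. It shows only the direct feature matching idea: compare features of a real image against features of a view synthesized at the predicted pose, so that unlabeled images can supervise the regressor.

```python
import numpy as np

rng = np.random.default_rng(0)


def extract_features(image):
    # Hypothetical domain-invariant feature extractor: a normalized
    # flattened image stands in for a learned CNN backbone.
    f = image.ravel().astype(np.float64)
    return f / (np.linalg.norm(f) + 1e-8)


def regress_pose(features):
    # Hypothetical APR head (stand-in for DFNet): maps a feature
    # vector to a 6-DoF pose (3 translation + 3 rotation parameters).
    W = rng.standard_normal((6, features.shape[0])) * 0.01
    return W @ features


def feature_matching_loss(real_img, synth_img):
    # Direct feature matching: penalize the distance between features
    # of the real image and of the exposure-adapted view synthesized
    # at the predicted pose; minimizing this on unlabeled images
    # closes the real-to-synthetic domain gap.
    fr, fs = extract_features(real_img), extract_features(synth_img)
    return 1.0 - float(fr @ fs)  # cosine distance, in [0, 2]


real = rng.random((8, 8))
# Stand-in for the Novel View Synthesizer's render at the predicted pose:
synth = real + 0.05 * rng.standard_normal((8, 8))

pose = regress_pose(extract_features(real))
loss = feature_matching_loss(real, synth)
print(pose.shape, round(loss, 4))
```

In the actual pipeline, `loss` would be backpropagated through the pose regressor, which is how unlabeled data contributes to training in the semi-supervised setting.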