Augmented reality (AR) using camera images on mobile devices is becoming popular for tourism promotion. However, obstructions such as tourists appearing in the camera images may cause camera pose estimation errors, resulting in CG misalignment and reduced visibility of the content. To avoid this problem, Indirect AR (IAR), which does not use real-time camera images, has been proposed. In this method, an omnidirectional image is captured and virtual objects are composited onto it in advance. Users experience AR by viewing a scene extracted from the composited omnidirectional image according to the device's orientation sensors. This provides robustness and high visibility. However, if the weather conditions and season in the pre-captured omnidirectional image differ from those at the time the AR is experienced, the realism of the AR experience is reduced. To overcome this problem, we propose a method for correcting the intensity and texture of a past omnidirectional image using camera images from mobile devices. We first perform semantic segmentation. We then reproduce the current sky pattern by panoramic image composition and inpainting. For the other regions, we correct the intensity by histogram matching. Experiments on various scenes demonstrate the effectiveness of the proposed method.
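As an illustration of the per-region intensity correction step, the following is a minimal sketch of masked histogram matching, assuming NumPy image arrays and boolean region masks from the segmentation step; the function name and interface are hypothetical and not taken from the paper.

```python
import numpy as np

def match_histograms_masked(source, src_mask, reference, ref_mask):
    """Remap intensities of `source` inside `src_mask` so their histogram
    matches that of `reference` inside `ref_mask` (per channel)."""
    out = source.astype(np.float64).copy()
    for c in range(source.shape[2]):
        src_vals = source[..., c][src_mask].ravel()
        ref_vals = reference[..., c][ref_mask].ravel()
        # empirical CDFs of both masked regions
        s_values, s_counts = np.unique(src_vals, return_counts=True)
        s_quantiles = np.cumsum(s_counts) / src_vals.size
        r_values, r_counts = np.unique(ref_vals, return_counts=True)
        r_quantiles = np.cumsum(r_counts) / ref_vals.size
        # for each source intensity, look up the reference intensity
        # at the same quantile (classic histogram matching)
        interp_values = np.interp(s_quantiles, r_quantiles, r_values)
        out[..., c][src_mask] = np.interp(src_vals, s_values, interp_values)
    return np.clip(out, 0, 255).astype(source.dtype)

# Hypothetical usage: correct the non-sky region of the past panorama so
# its intensity distribution matches the current camera image.
# past_pano, current_img: HxWx3 uint8 arrays; *_nonsky_mask: boolean masks
# obtained from semantic segmentation.
# corrected = match_histograms_masked(past_pano, pano_nonsky_mask,
#                                     current_img, img_nonsky_mask)
```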