Night-Time Scene Parsing (NTSP) is essential to many vision applications, especially autonomous driving. Most existing methods are proposed for day-time scene parsing. They rely on modeling spatial contextual cues from pixel intensities under even illumination. Hence, these methods do not perform well on night-time scenes, where such spatial contextual cues are buried in over-/under-exposed regions. In this paper, we first conduct an image frequency-based statistical experiment to interpret the discrepancies between day-time and night-time scenes. We find that image frequency distributions differ significantly between day-time and night-time scenes, and that understanding such frequency distributions is critical to the NTSP problem. Based on this, we propose to exploit image frequency distributions for night-time scene parsing. First, we propose a Learnable Frequency Encoder (LFE) that models the relationships among different frequency coefficients to measure all frequency components dynamically. Second, we propose a Spatial Frequency Fusion (SFF) module that fuses spatial and frequency information to guide the extraction of spatial context features. Extensive experiments show that our method performs favorably against state-of-the-art methods on the NightCity, NightCity+ and BDD100K-night datasets. In addition, we demonstrate that our method can be applied to existing day-time scene parsing methods and boost their performance on night-time scenes.
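The frequency-based statistical observation above can be illustrated with a minimal sketch (hypothetical; it uses NumPy's FFT on synthetic stand-in images, not the paper's actual experiment or datasets): after removing the DC component, a textured "day-time" image spreads spectral energy across frequencies, whereas a "night-time" image of flat dark regions with saturated blobs concentrates energy at low frequencies, i.e., the high-frequency contextual cues are suppressed.

```python
import numpy as np

def low_freq_fraction(img, cutoff=0.1):
    """Fraction of (mean-removed) spectral energy below a normalized
    radial frequency cutoff. Illustrative statistic, not the paper's."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    # Normalized radial frequency in [0, ~0.71]
    r = np.sqrt(((yy - h // 2) / h) ** 2 + ((xx - w // 2) / w) ** 2)
    return power[r <= cutoff].sum() / power.sum()

rng = np.random.default_rng(0)
# Synthetic "day-time" stand-in: rich texture everywhere (broadband spectrum)
day = rng.uniform(0.2, 0.8, size=(128, 128))
# Synthetic "night-time" stand-in: mostly near-black with saturated patches,
# mimicking under-/over-exposed regions (energy piles up at low frequencies)
night = np.full((128, 128), 0.02)
night[30:50, 30:50] = 1.0
night[80:100, 60:90] = 1.0

print("day :", low_freq_fraction(day))
print("night:", low_freq_fraction(night))
```

On these stand-ins, the night-time image's low-frequency fraction is much larger than the day-time one's, which is one way to visualize why intensity-based spatial cues are harder to extract at night.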