We present an efficient frequency-based neural representation termed PREF: a shallow MLP augmented with a phasor volume that covers a significantly broader spectrum than previous Fourier feature mapping or positional encoding. At its core is a compact 3D phasor volume in which frequencies are distributed uniformly over a 2D plane and dilated along the remaining 1D axis. To evaluate this volume efficiently, we develop a tailored Fourier transform that combines the fast Fourier transform with local interpolation, accelerating na\"ive Fourier mapping. We also introduce a Parseval regularizer that stabilizes frequency-based learning. In these ways, PREF reduces the costly MLP required by prior frequency-based representations, significantly closing the efficiency gap with other hybrid representations while improving interpretability. Comprehensive experiments demonstrate that PREF captures high-frequency details while remaining compact and robust across tasks including 2D image generalization, 3D signed distance function regression, and 5D neural radiance field reconstruction.
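To make the FFT-plus-interpolation acceleration concrete, here is a minimal sketch in a hypothetical 2D setting (the paper's phasor volume is 3D): instead of summing complex exponentials per query point, we apply one inverse FFT to the coefficient grid and then bilinearly interpolate the resulting dense feature grid at arbitrary coordinates. All names (`naive_eval`, `interp_eval`, the grid size `N`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny 2D grid of complex Fourier ("phasor") coefficients.
K = 8                      # frequencies per axis (illustrative)
P = rng.standard_normal((K, K)) + 1j * rng.standard_normal((K, K))

def naive_eval(P, x, y):
    """Naive Fourier mapping: sum_k P[k] * exp(2*pi*i * k.x) per query."""
    kx = np.arange(K)[:, None]
    ky = np.arange(K)[None, :]
    return np.real(np.sum(P * np.exp(2j * np.pi * (kx * x + ky * y))))

# Accelerated path: one inverse FFT onto a dense grid, reused for all queries.
N = 512                    # dense spatial resolution (illustrative)
# ifft2 divides by N*N; multiply back so grid[n] matches naive_eval at x=n/N.
grid = np.real(np.fft.ifft2(P, s=(N, N))) * N * N

def interp_eval(grid, x, y):
    """Bilinear interpolation of the dense grid at coordinates in [0, 1)."""
    gx, gy = x * N, y * N
    x0, y0 = int(gx) % N, int(gy) % N
    x1, y1 = (x0 + 1) % N, (y0 + 1) % N
    fx, fy = gx - int(gx), gy - int(gy)
    return ((1 - fx) * (1 - fy) * grid[x0, y0]
            + fx * (1 - fy) * grid[x1, y0]
            + (1 - fx) * fy * grid[x0, y1]
            + fx * fy * grid[x1, y1])
```

The interpolated value matches the naive Fourier sum exactly at grid points and approximates it in between, amortizing one FFT over many queries rather than paying a full spectral sum per point.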