Implicit neural networks (INNs) are highly effective for learning data representations. However, when the data contains details at many scales or spans a wide range of frequencies, most INNs inevitably produce over-smoothed patches or obvious noisy artifacts in the results, leading to a significant drop in performance. Adapting a result that contains both noisy and over-smoothed regions usually trades one issue for the other, suffering from either over-smoothing or noise. To overcome this challenge, we propose a new framework, coined FINN, that integrates a \emph{filtering} module into the \emph{implicit neural network} to perform data fitting while filtering out artifacts. The filtering module consists of a smoothing operator that acts on the intermediate results of the network and a recovering operator that brings distinct details from the input back to regions that have been overly smoothed. The proposed method significantly alleviates both over-smoothing and noise issues. We demonstrate the advantage of FINN on the image regression task, considering both real and synthetic images, and show significant improvements in both quantitative and qualitative results compared to state-of-the-art methods. Moreover, FINN yields better performance in both convergence speed and network stability. Source code is available at https://github.com/yixin26/FINN.
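To make the architecture described above concrete, the following is a minimal sketch of how a filtering module might be wired into a coordinate MLP for image regression, assuming a PyTorch implementation. The concrete operator choices here (box-filter smoothing, a detail mask that gates recovery) and all class and parameter names are illustrative assumptions for this sketch, not the paper's exact formulation or released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FilteringModule(nn.Module):
    """Illustrative filtering module: smooths intermediate features on the image
    grid, then restores detail where the input image has strong high-frequency
    content. Operator definitions are assumptions, not the paper's exact ones."""

    def __init__(self, kernel_size=3):
        super().__init__()
        self.kernel_size = kernel_size

    def forward(self, feat, detail_mask):
        # feat: (1, C, H, W) intermediate features; detail_mask: (1, 1, H, W) in [0, 1]
        smoothed = F.avg_pool2d(feat, self.kernel_size, stride=1,
                                padding=self.kernel_size // 2)   # smoothing operator
        # recovering operator: keep the original features where the input has detail
        return detail_mask * feat + (1.0 - detail_mask) * smoothed


class CoordMLPWithFiltering(nn.Module):
    """Coordinate MLP (x, y) -> RGB whose intermediate features are reshaped onto
    the H x W grid so the filtering module can act on them."""

    def __init__(self, hidden=256, layers=4, H=256, W=256):
        super().__init__()
        self.H, self.W = H, W
        dims = [2] + [hidden] * layers
        self.linears = nn.ModuleList(nn.Linear(dims[i], dims[i + 1])
                                     for i in range(layers))
        self.filters = nn.ModuleList(FilteringModule() for _ in range(layers))
        self.out = nn.Linear(hidden, 3)

    def forward(self, coords, detail_mask):
        # coords: (H*W, 2) pixel coordinates; detail_mask: (1, 1, H, W)
        h = coords
        for lin, filt in zip(self.linears, self.filters):
            h = torch.sin(lin(h))                             # SIREN-style activation
            grid = h.t().reshape(1, -1, self.H, self.W)       # (1, C, H, W)
            grid = filt(grid, detail_mask)                    # smooth + recover
            h = grid.reshape(1, -1, self.H * self.W)[0].t()   # back to (H*W, C)
        return self.out(h)
```

In this sketch the detail mask could be derived, for example, from the gradient magnitude of the target image, so that the recovering operator re-injects detail only in regions where the input actually contains it, while flat regions receive the smoothed features.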